00:00:00.000 Started by upstream project "autotest-per-patch" build number 126152
00:00:00.000 originally caused by:
00:00:00.000 Started by user sys_sgci
00:00:00.030 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.031 The recommended git tool is: git
00:00:00.031 using credential 00000000-0000-0000-0000-000000000002
00:00:00.032 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.051 Fetching changes from the remote Git repository
00:00:00.054 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.086 Using shallow fetch with depth 1
00:00:00.086 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.086 > git --version # timeout=10
00:00:00.140 > git --version # 'git version 2.39.2'
00:00:00.140 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.189 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.189 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:03.481 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.495 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.507 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD)
00:00:03.507 > git config core.sparsecheckout # timeout=10
00:00:03.518 > git read-tree -mu HEAD # timeout=10
00:00:03.536 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5
00:00:03.557 Commit message: "inventory: add WCP3 to free inventory"
00:00:03.557 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10
00:00:03.663 [Pipeline] Start of Pipeline
00:00:03.679 [Pipeline] library
00:00:03.680 Loading library shm_lib@master
00:00:03.680 Library shm_lib@master is cached. Copying from home.
00:00:03.696 [Pipeline] node
00:00:03.716 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest
00:00:03.717 [Pipeline] {
00:00:03.727 [Pipeline] catchError
00:00:03.728 [Pipeline] {
00:00:03.739 [Pipeline] wrap
00:00:03.746 [Pipeline] {
00:00:03.753 [Pipeline] stage
00:00:03.756 [Pipeline] { (Prologue)
00:00:03.943 [Pipeline] sh
00:00:04.223 + logger -p user.info -t JENKINS-CI
00:00:04.251 [Pipeline] echo
00:00:04.253 Node: WFP50
00:00:04.260 [Pipeline] sh
00:00:04.542 [Pipeline] setCustomBuildProperty
00:00:04.553 [Pipeline] echo
00:00:04.555 Cleanup processes
00:00:04.560 [Pipeline] sh
00:00:04.837 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.837 4120093 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.849 [Pipeline] sh
00:00:05.125 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:05.125 ++ grep -v 'sudo pgrep'
00:00:05.125 ++ awk '{print $1}'
00:00:05.125 + sudo kill -9
00:00:05.125 + true
00:00:05.138 [Pipeline] cleanWs
00:00:05.146 [WS-CLEANUP] Deleting project workspace...
00:00:05.146 [WS-CLEANUP] Deferred wipeout is used...
00:00:05.152 [WS-CLEANUP] done
00:00:05.155 [Pipeline] setCustomBuildProperty
00:00:05.169 [Pipeline] sh
00:00:05.446 + sudo git config --global --replace-all safe.directory '*'
00:00:05.509 [Pipeline] httpRequest
00:00:05.528 [Pipeline] echo
00:00:05.530 Sorcerer 10.211.164.101 is alive
00:00:05.536 [Pipeline] httpRequest
00:00:05.540 HttpMethod: GET
00:00:05.541 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.541 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.542 Response Code: HTTP/1.1 200 OK
00:00:05.543 Success: Status code 200 is in the accepted range: 200,404
00:00:05.543 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:06.383 [Pipeline] sh
00:00:06.659 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:06.932 [Pipeline] httpRequest
00:00:06.966 [Pipeline] echo
00:00:06.967 Sorcerer 10.211.164.101 is alive
00:00:06.973 [Pipeline] httpRequest
00:00:06.977 HttpMethod: GET
00:00:06.977 URL: http://10.211.164.101/packages/spdk_4835eb82bb1be9e262aefa045af927257ebac260.tar.gz
00:00:06.978 Sending request to url: http://10.211.164.101/packages/spdk_4835eb82bb1be9e262aefa045af927257ebac260.tar.gz
00:00:06.994 Response Code: HTTP/1.1 200 OK
00:00:06.994 Success: Status code 200 is in the accepted range: 200,404
00:00:06.995 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_4835eb82bb1be9e262aefa045af927257ebac260.tar.gz
00:01:02.011 [Pipeline] sh
00:01:02.291 + tar --no-same-owner -xf spdk_4835eb82bb1be9e262aefa045af927257ebac260.tar.gz
00:01:06.481 [Pipeline] sh
00:01:06.762 + git -C spdk log --oneline -n5
00:01:06.762 4835eb82b nvmf: consolidate listener addition in avahi_entry_group_add_listeners
00:01:06.762 719d03c6a sock/uring: only register net impl if supported
00:01:06.762 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev
00:01:06.762 9937c0160 lib/rdma: bind TRACE_BDEV_IO_START/DONE to OBJECT_NVMF_RDMA_IO
00:01:06.762 6c7c1f57e accel: add sequence outstanding stat
00:01:06.775 [Pipeline] }
00:01:06.793 [Pipeline] // stage
00:01:06.803 [Pipeline] stage
00:01:06.805 [Pipeline] { (Prepare)
00:01:06.825 [Pipeline] writeFile
00:01:06.842 [Pipeline] sh
00:01:07.123 + logger -p user.info -t JENKINS-CI
00:01:07.136 [Pipeline] sh
00:01:07.417 + logger -p user.info -t JENKINS-CI
00:01:07.429 [Pipeline] sh
00:01:07.709 + cat autorun-spdk.conf
00:01:07.710 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:07.710 SPDK_TEST_BLOCKDEV=1
00:01:07.710 SPDK_TEST_ISAL=1
00:01:07.710 SPDK_TEST_CRYPTO=1
00:01:07.710 SPDK_TEST_REDUCE=1
00:01:07.710 SPDK_TEST_VBDEV_COMPRESS=1
00:01:07.710 SPDK_RUN_UBSAN=1
00:01:07.716 RUN_NIGHTLY=0
00:01:07.722 [Pipeline] readFile
00:01:07.751 [Pipeline] withEnv
00:01:07.753 [Pipeline] {
00:01:07.768 [Pipeline] sh
00:01:08.050 + set -ex
00:01:08.050 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:01:08.050 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:08.050 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:08.050 ++ SPDK_TEST_BLOCKDEV=1
00:01:08.050 ++ SPDK_TEST_ISAL=1
00:01:08.050 ++ SPDK_TEST_CRYPTO=1
00:01:08.050 ++ SPDK_TEST_REDUCE=1
00:01:08.050 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:08.050 ++ SPDK_RUN_UBSAN=1
00:01:08.050 ++ RUN_NIGHTLY=0
00:01:08.050 + case $SPDK_TEST_NVMF_NICS in
00:01:08.050 + DRIVERS=
00:01:08.050 + [[ -n '' ]]
00:01:08.050 + exit 0
00:01:08.059 [Pipeline] }
00:01:08.079
[Pipeline] // withEnv 00:01:08.084 [Pipeline] } 00:01:08.103 [Pipeline] // stage 00:01:08.113 [Pipeline] catchError 00:01:08.115 [Pipeline] { 00:01:08.131 [Pipeline] timeout 00:01:08.131 Timeout set to expire in 40 min 00:01:08.133 [Pipeline] { 00:01:08.146 [Pipeline] stage 00:01:08.149 [Pipeline] { (Tests) 00:01:08.163 [Pipeline] sh 00:01:08.443 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:01:08.443 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:01:08.443 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:01:08.443 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:01:08.443 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:08.443 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:01:08.443 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:01:08.443 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:08.443 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:01:08.443 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:08.443 + [[ crypto-phy-autotest == pkgdep-* ]] 00:01:08.443 + cd /var/jenkins/workspace/crypto-phy-autotest 00:01:08.443 + source /etc/os-release 00:01:08.443 ++ NAME='Fedora Linux' 00:01:08.443 ++ VERSION='38 (Cloud Edition)' 00:01:08.443 ++ ID=fedora 00:01:08.443 ++ VERSION_ID=38 00:01:08.443 ++ VERSION_CODENAME= 00:01:08.443 ++ PLATFORM_ID=platform:f38 00:01:08.443 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:08.443 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:08.443 ++ LOGO=fedora-logo-icon 00:01:08.443 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:08.443 ++ HOME_URL=https://fedoraproject.org/ 00:01:08.443 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:08.443 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:08.443 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:08.443 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:08.443 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:08.443 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:08.443 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:08.443 ++ SUPPORT_END=2024-05-14 00:01:08.443 ++ VARIANT='Cloud Edition' 00:01:08.443 ++ VARIANT_ID=cloud 00:01:08.443 + uname -a 00:01:08.443 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:01:08.443 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:01:11.748 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:01:11.748 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:01:11.748 Hugepages 00:01:11.748 node hugesize free / total 00:01:11.748 node0 1048576kB 0 / 0 00:01:11.748 node0 2048kB 0 / 0 00:01:11.748 node1 1048576kB 0 / 0 00:01:11.748 node1 2048kB 0 / 0 00:01:11.748 00:01:11.748 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:11.748 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:11.748 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:11.748 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:11.748 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:11.748 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:11.748 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:11.748 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:11.748 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:11.748 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:01:11.748 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:11.748 I/OAT 0000:80:04.1 8086 2021 1 
ioatdma - - 00:01:11.748 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:11.748 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:11.748 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:11.748 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:11.748 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:11.749 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:11.749 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:01:11.749 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:01:11.749 + rm -f /tmp/spdk-ld-path 00:01:11.749 + source autorun-spdk.conf 00:01:11.749 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:11.749 ++ SPDK_TEST_BLOCKDEV=1 00:01:11.749 ++ SPDK_TEST_ISAL=1 00:01:11.749 ++ SPDK_TEST_CRYPTO=1 00:01:11.749 ++ SPDK_TEST_REDUCE=1 00:01:11.749 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:11.749 ++ SPDK_RUN_UBSAN=1 00:01:11.749 ++ RUN_NIGHTLY=0 00:01:11.749 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:11.749 + [[ -n '' ]] 00:01:11.749 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:11.749 + for M in /var/spdk/build-*-manifest.txt 00:01:11.749 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:11.749 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:11.749 + for M in /var/spdk/build-*-manifest.txt 00:01:11.749 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:11.749 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:11.749 ++ uname 00:01:11.749 + [[ Linux == \L\i\n\u\x ]] 00:01:11.749 + sudo dmesg -T 00:01:11.749 + sudo dmesg --clear 00:01:11.749 + dmesg_pid=4121066 00:01:11.749 + [[ Fedora Linux == FreeBSD ]] 00:01:11.749 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:11.749 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:11.749 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:11.749 + [[ -x /usr/src/fio-static/fio ]] 00:01:11.749 + export FIO_BIN=/usr/src/fio-static/fio 00:01:11.749 + FIO_BIN=/usr/src/fio-static/fio 00:01:11.749 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:11.749 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:01:11.749 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:11.749 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:11.749 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:11.749 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:11.749 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:11.749 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:11.749 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:11.749 + sudo dmesg -Tw 00:01:11.749 Test configuration: 00:01:11.749 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:11.749 SPDK_TEST_BLOCKDEV=1 00:01:11.749 SPDK_TEST_ISAL=1 00:01:11.749 SPDK_TEST_CRYPTO=1 00:01:11.749 SPDK_TEST_REDUCE=1 00:01:11.749 SPDK_TEST_VBDEV_COMPRESS=1 00:01:11.749 SPDK_RUN_UBSAN=1 00:01:11.749 RUN_NIGHTLY=0 09:05:20 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:11.749 09:05:20 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:11.749 09:05:20 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:11.749 09:05:20 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:11.749 09:05:20 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.749 09:05:20 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.749 09:05:20 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.749 09:05:20 -- paths/export.sh@5 -- $ export PATH 00:01:11.749 09:05:20 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.749 09:05:20 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:11.749 09:05:20 -- common/autobuild_common.sh@444 -- $ date +%s 00:01:11.749 09:05:20 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721027120.XXXXXX 00:01:11.749 09:05:20 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721027120.TK6SuX 00:01:11.749 09:05:20 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:01:11.749 09:05:20 -- common/autobuild_common.sh@450 
-- $ '[' -n '' ']' 00:01:11.749 09:05:20 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:01:11.749 09:05:20 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:11.749 09:05:20 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:11.749 09:05:20 -- common/autobuild_common.sh@460 -- $ get_config_params 00:01:11.749 09:05:20 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:01:11.749 09:05:20 -- common/autotest_common.sh@10 -- $ set +x 00:01:11.749 09:05:20 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:01:11.749 09:05:20 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:01:11.749 09:05:20 -- pm/common@17 -- $ local monitor 00:01:11.749 09:05:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.749 09:05:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.749 09:05:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.749 09:05:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.749 09:05:20 -- pm/common@25 -- $ sleep 1 00:01:11.749 09:05:20 -- pm/common@21 -- $ date +%s 00:01:11.749 09:05:20 -- pm/common@21 -- $ date +%s 00:01:11.749 09:05:20 -- pm/common@21 -- $ date +%s 00:01:11.749 09:05:20 -- pm/common@21 -- $ date +%s 00:01:11.749 09:05:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721027120 00:01:11.749 09:05:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721027120 00:01:11.749 09:05:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721027120 00:01:11.749 09:05:20 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721027120 00:01:11.749 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721027120_collect-vmstat.pm.log 00:01:11.749 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721027120_collect-cpu-load.pm.log 00:01:11.749 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721027120_collect-cpu-temp.pm.log 00:01:11.749 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721027120_collect-bmc-pm.bmc.pm.log 00:01:12.685 09:05:21 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:01:12.685 09:05:21 -- 
spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:12.685 09:05:21 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:12.685 09:05:21 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:12.685 09:05:21 -- spdk/autobuild.sh@16 -- $ date -u 00:01:12.685 Mon Jul 15 07:05:21 AM UTC 2024 00:01:12.685 09:05:21 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:12.685 v24.09-pre-203-g4835eb82b 00:01:12.685 09:05:21 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:12.685 09:05:21 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:12.685 09:05:21 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:12.685 09:05:21 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:12.685 09:05:21 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:12.685 09:05:21 -- common/autotest_common.sh@10 -- $ set +x 00:01:12.685 ************************************ 00:01:12.685 START TEST ubsan 00:01:12.685 ************************************ 00:01:12.685 09:05:21 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:01:12.685 using ubsan 00:01:12.685 00:01:12.685 real 0m0.000s 00:01:12.685 user 0m0.000s 00:01:12.685 sys 0m0.000s 00:01:12.685 09:05:21 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:01:12.685 09:05:21 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:12.685 ************************************ 00:01:12.685 END TEST ubsan 00:01:12.685 ************************************ 00:01:12.685 09:05:21 -- common/autotest_common.sh@1142 -- $ return 0 00:01:12.685 09:05:21 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:01:12.685 09:05:21 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:12.685 09:05:21 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:12.685 09:05:21 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:01:12.685 09:05:21 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:12.685 09:05:21 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:12.685 09:05:21 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:12.685 09:05:21 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:01:12.685 09:05:21 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:01:12.943 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:01:12.943 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:13.201 Using 'verbs' RDMA provider 00:01:29.442 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:44.312 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:44.313 Creating mk/config.mk...done. 00:01:44.313 Creating mk/cc.flags.mk...done. 00:01:44.313 Type 'make' to build. 
00:01:44.313 09:05:51 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:01:44.313 09:05:51 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:44.313 09:05:51 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:44.313 09:05:51 -- common/autotest_common.sh@10 -- $ set +x 00:01:44.313 ************************************ 00:01:44.313 START TEST make 00:01:44.313 ************************************ 00:01:44.313 09:05:51 make -- common/autotest_common.sh@1123 -- $ make -j72 00:01:44.313 make[1]: Nothing to be done for 'all'. 00:02:23.039 The Meson build system 00:02:23.039 Version: 1.3.1 00:02:23.039 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:02:23.039 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:02:23.039 Build type: native build 00:02:23.039 Program cat found: YES (/usr/bin/cat) 00:02:23.039 Project name: DPDK 00:02:23.039 Project version: 24.03.0 00:02:23.039 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:02:23.039 C linker for the host machine: cc ld.bfd 2.39-16 00:02:23.039 Host machine cpu family: x86_64 00:02:23.039 Host machine cpu: x86_64 00:02:23.039 Message: ## Building in Developer Mode ## 00:02:23.039 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:23.039 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:23.039 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:02:23.039 Program python3 found: YES (/usr/bin/python3) 00:02:23.039 Program cat found: YES (/usr/bin/cat) 00:02:23.039 Compiler for C supports arguments -march=native: YES 00:02:23.039 Checking for size of "void *" : 8 00:02:23.039 Checking for size of "void *" : 8 (cached) 00:02:23.039 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:02:23.039 Library m found: YES 00:02:23.039 Library numa found: YES 00:02:23.039 Has header "numaif.h" : YES 00:02:23.039 Library fdt found: NO 00:02:23.039 Library execinfo found: NO 00:02:23.039 Has header "execinfo.h" : YES 00:02:23.039 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:23.039 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:23.039 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:23.039 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:23.039 Run-time dependency openssl found: YES 3.0.9 00:02:23.039 Run-time dependency libpcap found: YES 1.10.4 00:02:23.039 Has header "pcap.h" with dependency libpcap: YES 00:02:23.039 Compiler for C supports arguments -Wcast-qual: YES 00:02:23.039 Compiler for C supports arguments -Wdeprecated: YES 00:02:23.039 Compiler for C supports arguments -Wformat: YES 00:02:23.039 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:23.039 Compiler for C supports arguments -Wformat-security: NO 00:02:23.039 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:23.039 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:23.039 Compiler for C supports arguments -Wnested-externs: YES 00:02:23.039 Compiler for C supports arguments -Wold-style-definition: YES 00:02:23.039 Compiler for C supports arguments -Wpointer-arith: YES 00:02:23.039 Compiler for C supports arguments -Wsign-compare: YES 00:02:23.039 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:23.039 Compiler for C supports arguments -Wundef: YES 00:02:23.039 Compiler for C 
supports arguments -Wwrite-strings: YES 00:02:23.039 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:23.039 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:23.039 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:23.039 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:23.039 Program objdump found: YES (/usr/bin/objdump) 00:02:23.039 Compiler for C supports arguments -mavx512f: YES 00:02:23.039 Checking if "AVX512 checking" compiles: YES 00:02:23.039 Fetching value of define "__SSE4_2__" : 1 00:02:23.039 Fetching value of define "__AES__" : 1 00:02:23.039 Fetching value of define "__AVX__" : 1 00:02:23.039 Fetching value of define "__AVX2__" : 1 00:02:23.039 Fetching value of define "__AVX512BW__" : 1 00:02:23.039 Fetching value of define "__AVX512CD__" : 1 00:02:23.039 Fetching value of define "__AVX512DQ__" : 1 00:02:23.039 Fetching value of define "__AVX512F__" : 1 00:02:23.039 Fetching value of define "__AVX512VL__" : 1 00:02:23.039 Fetching value of define "__PCLMUL__" : 1 00:02:23.039 Fetching value of define "__RDRND__" : 1 00:02:23.039 Fetching value of define "__RDSEED__" : 1 00:02:23.039 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:23.039 Fetching value of define "__znver1__" : (undefined) 00:02:23.039 Fetching value of define "__znver2__" : (undefined) 00:02:23.039 Fetching value of define "__znver3__" : (undefined) 00:02:23.039 Fetching value of define "__znver4__" : (undefined) 00:02:23.039 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:23.039 Message: lib/log: Defining dependency "log" 00:02:23.039 Message: lib/kvargs: Defining dependency "kvargs" 00:02:23.039 Message: lib/telemetry: Defining dependency "telemetry" 00:02:23.039 Checking for function "getentropy" : NO 00:02:23.039 Message: lib/eal: Defining dependency "eal" 00:02:23.039 Message: lib/ring: Defining dependency "ring" 00:02:23.039 Message: lib/rcu: Defining dependency "rcu" 00:02:23.039 Message: lib/mempool: Defining dependency "mempool" 00:02:23.039 Message: lib/mbuf: Defining dependency "mbuf" 00:02:23.039 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:23.039 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:23.039 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:23.039 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:23.039 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:23.039 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:23.039 Compiler for C supports arguments -mpclmul: YES 00:02:23.039 Compiler for C supports arguments -maes: YES 00:02:23.039 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:23.039 Compiler for C supports arguments -mavx512bw: YES 00:02:23.039 Compiler for C supports arguments -mavx512dq: YES 00:02:23.039 Compiler for C supports arguments -mavx512vl: YES 00:02:23.039 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:23.039 Compiler for C supports arguments -mavx2: YES 00:02:23.039 Compiler for C supports arguments -mavx: YES 00:02:23.039 Message: lib/net: Defining dependency "net" 00:02:23.039 Message: lib/meter: Defining dependency "meter" 00:02:23.039 Message: lib/ethdev: Defining dependency "ethdev" 00:02:23.039 Message: lib/pci: Defining dependency "pci" 00:02:23.040 Message: lib/cmdline: Defining dependency "cmdline" 00:02:23.040 Message: lib/hash: Defining dependency "hash" 00:02:23.040 Message: lib/timer: Defining dependency "timer" 00:02:23.040 Message: 
lib/compressdev: Defining dependency "compressdev" 00:02:23.040 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:23.040 Message: lib/dmadev: Defining dependency "dmadev" 00:02:23.040 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:23.040 Message: lib/power: Defining dependency "power" 00:02:23.040 Message: lib/reorder: Defining dependency "reorder" 00:02:23.040 Message: lib/security: Defining dependency "security" 00:02:23.040 Has header "linux/userfaultfd.h" : YES 00:02:23.040 Has header "linux/vduse.h" : YES 00:02:23.040 Message: lib/vhost: Defining dependency "vhost" 00:02:23.040 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:23.040 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:02:23.040 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:23.040 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:23.040 Compiler for C supports arguments -std=c11: YES 00:02:23.040 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:02:23.040 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:02:23.040 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:02:23.040 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:02:23.040 Run-time dependency libmlx5 found: YES 1.24.44.0 00:02:23.040 Run-time dependency libibverbs found: YES 1.14.44.0 00:02:23.040 Library mtcr_ul found: NO 00:02:23.040 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:02:23.040 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:02:23.040 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:02:23.040 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies 
libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:25.637 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header 
"infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:25.637 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:25.637 Configuring mlx5_autoconf.h using configuration 00:02:25.637 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:25.637 Run-time dependency libcrypto found: YES 3.0.9 00:02:25.637 Library IPSec_MB found: YES 00:02:25.637 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:25.637 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:25.637 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:25.637 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:25.637 Library IPSec_MB found: YES 00:02:25.637 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:25.637 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:25.637 Compiler for C supports arguments -std=c11: YES (cached) 00:02:25.637 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:25.637 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:25.637 
Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:25.637 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:25.637 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:25.637 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:25.637 Library libisal found: NO 00:02:25.637 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:25.637 Compiler for C supports arguments -std=c11: YES (cached) 00:02:25.637 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:25.637 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:25.637 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:25.637 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:25.637 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:25.637 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:25.637 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:25.637 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:25.637 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:25.637 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:25.637 Program doxygen found: YES (/usr/bin/doxygen) 00:02:25.637 Configuring doxy-api-html.conf using configuration 00:02:25.637 Configuring doxy-api-man.conf using configuration 00:02:25.637 Program mandb found: YES (/usr/bin/mandb) 00:02:25.637 Program sphinx-build found: NO 00:02:25.637 Configuring rte_build_config.h using configuration 00:02:25.637 Message: 00:02:25.637 ================= 00:02:25.637 Applications Enabled 00:02:25.637 ================= 00:02:25.637 00:02:25.637 apps: 00:02:25.637 00:02:25.637 00:02:25.637 Message: 00:02:25.637 ================= 00:02:25.637 Libraries Enabled 00:02:25.637 ================= 00:02:25.637 00:02:25.637 libs: 00:02:25.637 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:25.637 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:25.637 cryptodev, dmadev, power, reorder, security, vhost, 00:02:25.637 00:02:25.637 Message: 00:02:25.637 =============== 00:02:25.637 Drivers Enabled 00:02:25.637 =============== 00:02:25.637 00:02:25.637 common: 00:02:25.637 mlx5, qat, 00:02:25.637 bus: 00:02:25.637 auxiliary, pci, vdev, 00:02:25.637 mempool: 00:02:25.637 ring, 00:02:25.637 dma: 00:02:25.637 00:02:25.637 net: 00:02:25.637 00:02:25.637 crypto: 00:02:25.637 ipsec_mb, mlx5, 00:02:25.637 compress: 00:02:25.637 isal, mlx5, 00:02:25.637 vdpa: 00:02:25.637 00:02:25.637 00:02:25.637 Message: 00:02:25.637 ================= 00:02:25.637 Content Skipped 00:02:25.637 ================= 00:02:25.637 00:02:25.637 apps: 00:02:25.637 dumpcap: explicitly disabled via build config 00:02:25.637 graph: explicitly disabled via build config 00:02:25.637 pdump: explicitly disabled via build config 00:02:25.637 proc-info: explicitly disabled via build config 00:02:25.637 test-acl: explicitly disabled via build config 00:02:25.637 test-bbdev: explicitly disabled via build config 00:02:25.637 test-cmdline: explicitly disabled via build config 00:02:25.637 test-compress-perf: explicitly disabled via build config 00:02:25.637 test-crypto-perf: explicitly disabled via build config 00:02:25.637 test-dma-perf: explicitly disabled via build config 00:02:25.638 test-eventdev: explicitly disabled via build config 00:02:25.638 test-fib: explicitly disabled via 
build config 00:02:25.638 test-flow-perf: explicitly disabled via build config 00:02:25.638 test-gpudev: explicitly disabled via build config 00:02:25.638 test-mldev: explicitly disabled via build config 00:02:25.638 test-pipeline: explicitly disabled via build config 00:02:25.638 test-pmd: explicitly disabled via build config 00:02:25.638 test-regex: explicitly disabled via build config 00:02:25.638 test-sad: explicitly disabled via build config 00:02:25.638 test-security-perf: explicitly disabled via build config 00:02:25.638 00:02:25.638 libs: 00:02:25.638 argparse: explicitly disabled via build config 00:02:25.638 metrics: explicitly disabled via build config 00:02:25.638 acl: explicitly disabled via build config 00:02:25.638 bbdev: explicitly disabled via build config 00:02:25.638 bitratestats: explicitly disabled via build config 00:02:25.638 bpf: explicitly disabled via build config 00:02:25.638 cfgfile: explicitly disabled via build config 00:02:25.638 distributor: explicitly disabled via build config 00:02:25.638 efd: explicitly disabled via build config 00:02:25.638 eventdev: explicitly disabled via build config 00:02:25.638 dispatcher: explicitly disabled via build config 00:02:25.638 gpudev: explicitly disabled via build config 00:02:25.638 gro: explicitly disabled via build config 00:02:25.638 gso: explicitly disabled via build config 00:02:25.638 ip_frag: explicitly disabled via build config 00:02:25.638 jobstats: explicitly disabled via build config 00:02:25.638 latencystats: explicitly disabled via build config 00:02:25.638 lpm: explicitly disabled via build config 00:02:25.638 member: explicitly disabled via build config 00:02:25.638 pcapng: explicitly disabled via build config 00:02:25.638 rawdev: explicitly disabled via build config 00:02:25.638 regexdev: explicitly disabled via build config 00:02:25.638 mldev: explicitly disabled via build config 00:02:25.638 rib: explicitly disabled via build config 00:02:25.638 sched: explicitly disabled via build config 00:02:25.638 stack: explicitly disabled via build config 00:02:25.638 ipsec: explicitly disabled via build config 00:02:25.638 pdcp: explicitly disabled via build config 00:02:25.638 fib: explicitly disabled via build config 00:02:25.638 port: explicitly disabled via build config 00:02:25.638 pdump: explicitly disabled via build config 00:02:25.638 table: explicitly disabled via build config 00:02:25.638 pipeline: explicitly disabled via build config 00:02:25.638 graph: explicitly disabled via build config 00:02:25.638 node: explicitly disabled via build config 00:02:25.638 00:02:25.638 drivers: 00:02:25.638 common/cpt: not in enabled drivers build config 00:02:25.638 common/dpaax: not in enabled drivers build config 00:02:25.638 common/iavf: not in enabled drivers build config 00:02:25.638 common/idpf: not in enabled drivers build config 00:02:25.638 common/ionic: not in enabled drivers build config 00:02:25.638 common/mvep: not in enabled drivers build config 00:02:25.638 common/octeontx: not in enabled drivers build config 00:02:25.638 bus/cdx: not in enabled drivers build config 00:02:25.638 bus/dpaa: not in enabled drivers build config 00:02:25.638 bus/fslmc: not in enabled drivers build config 00:02:25.638 bus/ifpga: not in enabled drivers build config 00:02:25.638 bus/platform: not in enabled drivers build config 00:02:25.638 bus/uacce: not in enabled drivers build config 00:02:25.638 bus/vmbus: not in enabled drivers build config 00:02:25.638 common/cnxk: not in enabled drivers build config 00:02:25.638 
common/nfp: not in enabled drivers build config 00:02:25.638 common/nitrox: not in enabled drivers build config 00:02:25.638 common/sfc_efx: not in enabled drivers build config 00:02:25.638 mempool/bucket: not in enabled drivers build config 00:02:25.638 mempool/cnxk: not in enabled drivers build config 00:02:25.638 mempool/dpaa: not in enabled drivers build config 00:02:25.638 mempool/dpaa2: not in enabled drivers build config 00:02:25.638 mempool/octeontx: not in enabled drivers build config 00:02:25.638 mempool/stack: not in enabled drivers build config 00:02:25.638 dma/cnxk: not in enabled drivers build config 00:02:25.638 dma/dpaa: not in enabled drivers build config 00:02:25.638 dma/dpaa2: not in enabled drivers build config 00:02:25.638 dma/hisilicon: not in enabled drivers build config 00:02:25.638 dma/idxd: not in enabled drivers build config 00:02:25.638 dma/ioat: not in enabled drivers build config 00:02:25.638 dma/skeleton: not in enabled drivers build config 00:02:25.638 net/af_packet: not in enabled drivers build config 00:02:25.638 net/af_xdp: not in enabled drivers build config 00:02:25.638 net/ark: not in enabled drivers build config 00:02:25.638 net/atlantic: not in enabled drivers build config 00:02:25.638 net/avp: not in enabled drivers build config 00:02:25.638 net/axgbe: not in enabled drivers build config 00:02:25.638 net/bnx2x: not in enabled drivers build config 00:02:25.638 net/bnxt: not in enabled drivers build config 00:02:25.638 net/bonding: not in enabled drivers build config 00:02:25.638 net/cnxk: not in enabled drivers build config 00:02:25.638 net/cpfl: not in enabled drivers build config 00:02:25.638 net/cxgbe: not in enabled drivers build config 00:02:25.638 net/dpaa: not in enabled drivers build config 00:02:25.638 net/dpaa2: not in enabled drivers build config 00:02:25.638 net/e1000: not in enabled drivers build config 00:02:25.638 net/ena: not in enabled drivers build config 00:02:25.638 net/enetc: not in enabled drivers build config 00:02:25.638 net/enetfec: not in enabled drivers build config 00:02:25.638 net/enic: not in enabled drivers build config 00:02:25.638 net/failsafe: not in enabled drivers build config 00:02:25.638 net/fm10k: not in enabled drivers build config 00:02:25.638 net/gve: not in enabled drivers build config 00:02:25.638 net/hinic: not in enabled drivers build config 00:02:25.638 net/hns3: not in enabled drivers build config 00:02:25.638 net/i40e: not in enabled drivers build config 00:02:25.638 net/iavf: not in enabled drivers build config 00:02:25.638 net/ice: not in enabled drivers build config 00:02:25.638 net/idpf: not in enabled drivers build config 00:02:25.638 net/igc: not in enabled drivers build config 00:02:25.638 net/ionic: not in enabled drivers build config 00:02:25.638 net/ipn3ke: not in enabled drivers build config 00:02:25.638 net/ixgbe: not in enabled drivers build config 00:02:25.638 net/mana: not in enabled drivers build config 00:02:25.638 net/memif: not in enabled drivers build config 00:02:25.638 net/mlx4: not in enabled drivers build config 00:02:25.638 net/mlx5: not in enabled drivers build config 00:02:25.638 net/mvneta: not in enabled drivers build config 00:02:25.638 net/mvpp2: not in enabled drivers build config 00:02:25.638 net/netvsc: not in enabled drivers build config 00:02:25.638 net/nfb: not in enabled drivers build config 00:02:25.638 net/nfp: not in enabled drivers build config 00:02:25.638 net/ngbe: not in enabled drivers build config 00:02:25.638 net/null: not in enabled drivers build config 
00:02:25.638 net/octeontx: not in enabled drivers build config 00:02:25.638 net/octeon_ep: not in enabled drivers build config 00:02:25.638 net/pcap: not in enabled drivers build config 00:02:25.638 net/pfe: not in enabled drivers build config 00:02:25.638 net/qede: not in enabled drivers build config 00:02:25.638 net/ring: not in enabled drivers build config 00:02:25.638 net/sfc: not in enabled drivers build config 00:02:25.638 net/softnic: not in enabled drivers build config 00:02:25.638 net/tap: not in enabled drivers build config 00:02:25.638 net/thunderx: not in enabled drivers build config 00:02:25.638 net/txgbe: not in enabled drivers build config 00:02:25.638 net/vdev_netvsc: not in enabled drivers build config 00:02:25.638 net/vhost: not in enabled drivers build config 00:02:25.638 net/virtio: not in enabled drivers build config 00:02:25.638 net/vmxnet3: not in enabled drivers build config 00:02:25.638 raw/*: missing internal dependency, "rawdev" 00:02:25.638 crypto/armv8: not in enabled drivers build config 00:02:25.638 crypto/bcmfs: not in enabled drivers build config 00:02:25.638 crypto/caam_jr: not in enabled drivers build config 00:02:25.638 crypto/ccp: not in enabled drivers build config 00:02:25.638 crypto/cnxk: not in enabled drivers build config 00:02:25.638 crypto/dpaa_sec: not in enabled drivers build config 00:02:25.638 crypto/dpaa2_sec: not in enabled drivers build config 00:02:25.638 crypto/mvsam: not in enabled drivers build config 00:02:25.638 crypto/nitrox: not in enabled drivers build config 00:02:25.638 crypto/null: not in enabled drivers build config 00:02:25.638 crypto/octeontx: not in enabled drivers build config 00:02:25.638 crypto/openssl: not in enabled drivers build config 00:02:25.638 crypto/scheduler: not in enabled drivers build config 00:02:25.638 crypto/uadk: not in enabled drivers build config 00:02:25.638 crypto/virtio: not in enabled drivers build config 00:02:25.638 compress/nitrox: not in enabled drivers build config 00:02:25.638 compress/octeontx: not in enabled drivers build config 00:02:25.638 compress/zlib: not in enabled drivers build config 00:02:25.638 regex/*: missing internal dependency, "regexdev" 00:02:25.638 ml/*: missing internal dependency, "mldev" 00:02:25.638 vdpa/ifc: not in enabled drivers build config 00:02:25.638 vdpa/mlx5: not in enabled drivers build config 00:02:25.638 vdpa/nfp: not in enabled drivers build config 00:02:25.638 vdpa/sfc: not in enabled drivers build config 00:02:25.638 event/*: missing internal dependency, "eventdev" 00:02:25.638 baseband/*: missing internal dependency, "bbdev" 00:02:25.638 gpu/*: missing internal dependency, "gpudev" 00:02:25.638 00:02:25.638 00:02:25.897 Build targets in project: 115 00:02:25.897 00:02:25.897 DPDK 24.03.0 00:02:25.897 00:02:25.897 User defined options 00:02:25.897 buildtype : debug 00:02:25.897 default_library : shared 00:02:25.897 libdir : lib 00:02:25.897 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:25.897 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:25.897 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:25.897 cpu_instruction_set: native 00:02:25.897 disable_apps : 
test-dma-perf,test,test-sad,test-acl,test-pmd,test-mldev,test-compress-perf,test-cmdline,test-regex,test-fib,graph,test-bbdev,dumpcap,test-gpudev,proc-info,test-pipeline,test-flow-perf,test-crypto-perf,pdump,test-eventdev,test-security-perf 00:02:25.897 disable_libs : port,lpm,ipsec,regexdev,dispatcher,argparse,bitratestats,rawdev,stack,graph,acl,bbdev,pipeline,member,sched,pcapng,mldev,eventdev,efd,metrics,latencystats,cfgfile,ip_frag,jobstats,pdump,pdcp,rib,node,fib,distributor,gso,table,bpf,gpudev,gro 00:02:25.897 enable_docs : false 00:02:25.897 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:25.897 enable_kmods : false 00:02:25.897 max_lcores : 128 00:02:25.897 tests : false 00:02:25.897 00:02:25.897 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:26.469 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:26.469 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:26.469 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:26.469 [3/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:26.469 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:26.726 [5/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:26.726 [6/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:26.726 [7/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:26.726 [8/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:26.726 [9/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:26.726 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:26.726 [11/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:26.726 [12/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:26.726 [13/378] Linking static target lib/librte_kvargs.a 00:02:26.726 [14/378] Linking static target lib/librte_log.a 00:02:26.726 [15/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:26.726 [16/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:26.726 [17/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:26.726 [18/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:26.726 [19/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:26.989 [20/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:26.989 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:26.989 [22/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:26.989 [23/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:26.989 [24/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:26.989 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:26.989 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:26.989 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:26.989 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:27.252 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:27.252 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:27.252 [31/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:27.252 [32/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:27.252 [33/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.252 [34/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:27.252 [35/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:27.252 [36/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:27.252 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:27.252 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:27.252 [39/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:27.252 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:27.252 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:27.252 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:27.252 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:27.252 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:27.252 [45/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:27.252 [46/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:27.252 [47/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:27.252 [48/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:27.252 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:27.252 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:27.252 [51/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:27.252 [52/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:27.252 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:27.252 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:27.252 [55/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:27.252 [56/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:27.252 [57/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:27.252 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:27.252 [59/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:27.252 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:27.252 [61/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:27.252 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:27.252 [63/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:27.252 [64/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:27.252 [65/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:27.252 [66/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:27.252 [67/378] Linking static target lib/librte_ring.a 00:02:27.252 [68/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:27.252 [69/378] Compiling C object 
lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:27.252 [70/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:27.252 [71/378] Linking static target lib/librte_pci.a 00:02:27.252 [72/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:27.252 [73/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:27.252 [74/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:27.252 [75/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:27.252 [76/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:27.252 [77/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:27.252 [78/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:27.252 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:27.252 [80/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:27.252 [81/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:27.252 [82/378] Linking static target lib/librte_telemetry.a 00:02:27.252 [83/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:27.252 [84/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:27.252 [85/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:27.252 [86/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:27.252 [87/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:27.252 [88/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:27.252 [89/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:27.252 [90/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:27.252 [91/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:27.511 [92/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:27.511 [93/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:27.511 [94/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:27.511 [95/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:27.511 [96/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:27.511 [97/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:27.511 [98/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:27.511 [99/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:27.511 [100/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:27.511 [101/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:27.511 [102/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:27.511 [103/378] Linking static target lib/librte_mempool.a 00:02:27.511 [104/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:27.511 [105/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:27.511 [106/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:27.511 [107/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:27.511 [108/378] Linking static target lib/librte_rcu.a 00:02:27.511 [109/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.511 [110/378] Compiling C object 
lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:27.511 [111/378] Linking static target lib/librte_meter.a 00:02:27.511 [112/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:27.511 [113/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:27.511 [114/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:27.511 [115/378] Linking static target lib/librte_mbuf.a 00:02:27.511 [116/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:27.511 [117/378] Linking target lib/librte_log.so.24.1 00:02:27.771 [118/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:27.771 [119/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:27.771 [120/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.771 [121/378] Linking static target lib/librte_net.a 00:02:27.771 [122/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:27.771 [123/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:27.771 [124/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:27.771 [125/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.771 [126/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:27.771 [127/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:27.771 [128/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:27.771 [129/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:28.035 [130/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:28.035 [131/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:28.035 [132/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:28.035 [133/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.035 [134/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:28.035 [135/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:28.035 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:28.035 [137/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:28.035 [138/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:28.035 [139/378] Linking target lib/librte_kvargs.so.24.1 00:02:28.035 [140/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:28.035 [141/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:28.035 [142/378] Linking static target lib/librte_timer.a 00:02:28.035 [143/378] Linking static target lib/librte_cmdline.a 00:02:28.035 [144/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:28.035 [145/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:28.035 [146/378] Linking static target lib/librte_dmadev.a 00:02:28.035 [147/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:28.035 [148/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:28.035 [149/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:28.035 [150/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:28.035 [151/378] Compiling C 
object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:28.035 [152/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:28.035 [153/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:28.035 [154/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.035 [155/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.035 [156/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:28.035 [157/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:28.035 [158/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:28.036 [159/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:28.036 [160/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:28.036 [161/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:28.036 [162/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:28.036 [163/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:28.036 [164/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:28.036 [165/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:28.036 [166/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:28.036 [167/378] Linking target lib/librte_telemetry.so.24.1 00:02:28.036 [168/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:28.036 [169/378] Linking static target lib/librte_eal.a 00:02:28.036 [170/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:28.036 [171/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:28.036 [172/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:28.036 [173/378] Linking static target lib/librte_compressdev.a 00:02:28.036 [174/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.036 [175/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:28.300 [176/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:28.300 [177/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:28.300 [178/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:28.301 [179/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:28.301 [180/378] Linking static target lib/librte_power.a 00:02:28.301 [181/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:28.301 [182/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:28.301 [183/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:28.301 [184/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:28.301 [185/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:28.301 [186/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:28.301 [187/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:28.301 [188/378] Linking static target lib/librte_reorder.a 00:02:28.301 [189/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:28.301 [190/378] Linking static target 
drivers/libtmp_rte_bus_vdev.a 00:02:28.301 [191/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:28.301 [192/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:28.301 [193/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:28.301 [194/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:28.301 [195/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:28.301 [196/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:28.301 [197/378] Linking static target lib/librte_security.a 00:02:28.301 [198/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:28.563 [199/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:28.563 [200/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:28.563 [201/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:28.563 [202/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:28.563 [203/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:28.563 [204/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:28.563 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:28.563 [206/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:28.563 [207/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.563 [208/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.563 [209/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:28.563 [210/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.563 [211/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:28.563 [212/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:28.563 [213/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:28.563 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:28.563 [215/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:28.563 [216/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:28.563 [217/378] Linking static target lib/librte_hash.a 00:02:28.563 [218/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:28.563 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:28.563 [220/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:28.563 [221/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:28.563 [222/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:28.563 [223/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:28.823 [224/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:28.823 [225/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:28.823 [226/378] Compiling C object 
drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:28.823 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:28.823 [228/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:28.823 [229/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:28.823 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:28.823 [231/378] Linking static target drivers/librte_bus_pci.a 00:02:28.823 [232/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:28.823 [233/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:28.823 [234/378] Linking static target drivers/librte_bus_vdev.a 00:02:28.823 [235/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:28.823 [236/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:28.824 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:28.824 [238/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:28.824 [239/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:28.824 [240/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:28.824 [241/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.824 [242/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:28.824 [243/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.824 [244/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:28.824 [245/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:28.824 [246/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.824 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:28.824 [248/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:28.824 [249/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:28.824 [250/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:28.824 [251/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:28.824 [252/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:28.824 [253/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:28.824 [254/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.824 [255/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:28.824 [256/378] Linking static target lib/librte_cryptodev.a 00:02:28.824 [257/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.824 [258/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:28.824 [259/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:29.081 [260/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:29.081 [261/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:29.081 [262/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:29.081 [263/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:29.081 [264/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:29.081 [265/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.081 [266/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:29.081 [267/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:29.081 [268/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:29.081 [269/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.081 [270/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:29.081 [271/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:29.081 [272/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:29.081 [273/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:29.081 [274/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:29.081 [275/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:29.081 [276/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:29.081 [277/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:29.081 [278/378] Linking static target drivers/librte_mempool_ring.a 00:02:29.340 [279/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:29.340 [280/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.340 [281/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:29.340 [282/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:29.340 [283/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:29.340 [284/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:29.340 [285/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:29.340 [286/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:29.340 [287/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:29.340 [288/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:29.340 [289/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:29.340 [290/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:29.340 [291/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:29.340 [292/378] Compiling C object 
drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:29.599 [293/378] Linking static target drivers/librte_compress_isal.a 00:02:29.599 [294/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:29.599 [295/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:29.599 [296/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.599 [297/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:29.599 [298/378] Linking static target lib/librte_ethdev.a 00:02:29.599 [299/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.599 [300/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:29.599 [301/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:29.599 [302/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:29.599 [303/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:29.599 [304/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:29.599 [305/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:29.599 [306/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:29.599 [307/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:29.599 [308/378] Linking static target drivers/librte_compress_mlx5.a 00:02:29.599 [309/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:29.857 [310/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:29.857 [311/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:29.857 [312/378] Linking static target drivers/librte_common_mlx5.a 00:02:29.857 [313/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:29.857 [314/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:29.857 [315/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:29.857 [316/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:30.117 [317/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:30.377 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:30.377 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:30.635 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:30.894 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:30.894 [322/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:30.894 [323/378] Linking static target drivers/librte_common_qat.a 00:02:30.894 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:30.894 [325/378] Linking static target lib/librte_vhost.a 00:02:31.153 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.056 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.600 [328/378] Generating 
drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:38.904 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:40.806 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.066 [331/378] Linking target lib/librte_eal.so.24.1 00:02:41.066 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:41.324 [333/378] Linking target lib/librte_meter.so.24.1 00:02:41.324 [334/378] Linking target lib/librte_dmadev.so.24.1 00:02:41.324 [335/378] Linking target lib/librte_ring.so.24.1 00:02:41.324 [336/378] Linking target lib/librte_timer.so.24.1 00:02:41.324 [337/378] Linking target lib/librte_pci.so.24.1 00:02:41.324 [338/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:41.324 [339/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:41.324 [340/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:41.324 [341/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:41.324 [342/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:41.324 [343/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:41.324 [344/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:41.324 [345/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:41.324 [346/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:41.324 [347/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:41.324 [348/378] Linking target lib/librte_mempool.so.24.1 00:02:41.324 [349/378] Linking target lib/librte_rcu.so.24.1 00:02:41.584 [350/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:41.584 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:41.584 [352/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:41.584 [353/378] Linking target lib/librte_mbuf.so.24.1 00:02:41.584 [354/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:41.884 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:41.884 [356/378] Linking target lib/librte_reorder.so.24.1 00:02:41.884 [357/378] Linking target lib/librte_compressdev.so.24.1 00:02:41.884 [358/378] Linking target lib/librte_cryptodev.so.24.1 00:02:41.884 [359/378] Linking target lib/librte_net.so.24.1 00:02:42.142 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:42.142 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:42.142 [362/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:42.142 [363/378] Linking target lib/librte_hash.so.24.1 00:02:42.142 [364/378] Linking target lib/librte_security.so.24.1 00:02:42.142 [365/378] Linking target lib/librte_cmdline.so.24.1 00:02:42.142 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:42.142 [367/378] Linking target lib/librte_ethdev.so.24.1 00:02:42.142 [368/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:42.142 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:42.142 
[370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:42.399 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:42.399 [372/378] Linking target lib/librte_power.so.24.1 00:02:42.399 [373/378] Linking target lib/librte_vhost.so.24.1 00:02:42.399 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:42.658 [375/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:42.658 [376/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:42.658 [377/378] Linking target drivers/librte_common_qat.so.24.1 00:02:42.658 [378/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:42.658 INFO: autodetecting backend as ninja 00:02:42.658 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:44.035 CC lib/ut/ut.o 00:02:44.035 CC lib/ut_mock/mock.o 00:02:44.035 CC lib/log/log.o 00:02:44.035 CC lib/log/log_flags.o 00:02:44.035 CC lib/log/log_deprecated.o 00:02:44.035 LIB libspdk_ut.a 00:02:44.035 LIB libspdk_ut_mock.a 00:02:44.035 SO libspdk_ut.so.2.0 00:02:44.293 SO libspdk_ut_mock.so.6.0 00:02:44.293 SYMLINK libspdk_ut.so 00:02:44.293 SYMLINK libspdk_ut_mock.so 00:02:44.293 LIB libspdk_log.a 00:02:44.293 SO libspdk_log.so.7.0 00:02:44.293 SYMLINK libspdk_log.so 00:02:44.858 CC lib/util/bit_array.o 00:02:44.858 CC lib/util/cpuset.o 00:02:44.858 CC lib/dma/dma.o 00:02:44.858 CC lib/util/base64.o 00:02:44.858 CC lib/util/crc32.o 00:02:44.858 CC lib/ioat/ioat.o 00:02:44.858 CC lib/util/crc16.o 00:02:44.858 CC lib/util/crc32c.o 00:02:44.858 CC lib/util/crc64.o 00:02:44.858 CC lib/util/crc32_ieee.o 00:02:44.858 CC lib/util/dif.o 00:02:44.858 CC lib/util/fd.o 00:02:44.858 CC lib/util/file.o 00:02:44.858 CC lib/util/hexlify.o 00:02:44.858 CC lib/util/iov.o 00:02:44.858 CC lib/util/math.o 00:02:44.858 CC lib/util/pipe.o 00:02:44.858 CC lib/util/strerror_tls.o 00:02:44.858 CXX lib/trace_parser/trace.o 00:02:44.858 CC lib/util/string.o 00:02:44.858 CC lib/util/uuid.o 00:02:44.858 CC lib/util/fd_group.o 00:02:44.858 CC lib/util/xor.o 00:02:44.858 CC lib/util/zipf.o 00:02:44.858 CC lib/vfio_user/host/vfio_user_pci.o 00:02:44.858 CC lib/vfio_user/host/vfio_user.o 00:02:44.858 LIB libspdk_dma.a 00:02:44.858 SO libspdk_dma.so.4.0 00:02:45.116 SYMLINK libspdk_dma.so 00:02:45.116 LIB libspdk_ioat.a 00:02:45.116 SO libspdk_ioat.so.7.0 00:02:45.116 SYMLINK libspdk_ioat.so 00:02:45.116 LIB libspdk_vfio_user.a 00:02:45.374 SO libspdk_vfio_user.so.5.0 00:02:45.374 LIB libspdk_util.a 00:02:45.374 SYMLINK libspdk_vfio_user.so 00:02:45.374 SO libspdk_util.so.9.1 00:02:45.632 SYMLINK libspdk_util.so 00:02:45.632 LIB libspdk_trace_parser.a 00:02:45.632 SO libspdk_trace_parser.so.5.0 00:02:45.890 SYMLINK libspdk_trace_parser.so 00:02:45.890 CC lib/env_dpdk/env.o 00:02:45.890 CC lib/env_dpdk/memory.o 00:02:45.890 CC lib/env_dpdk/pci.o 00:02:45.890 CC lib/env_dpdk/init.o 00:02:45.890 CC lib/env_dpdk/pci_ioat.o 00:02:45.890 CC lib/env_dpdk/pci_virtio.o 00:02:45.890 CC lib/env_dpdk/threads.o 00:02:45.890 CC lib/env_dpdk/pci_vmd.o 00:02:45.890 CC lib/conf/conf.o 00:02:45.890 CC lib/env_dpdk/pci_idxd.o 00:02:45.890 CC lib/env_dpdk/sigbus_handler.o 00:02:45.890 CC lib/env_dpdk/pci_event.o 00:02:45.890 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:45.890 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:45.890 CC lib/env_dpdk/pci_dpdk.o 00:02:45.890 CC lib/rdma_utils/rdma_utils.o 00:02:45.890 CC lib/vmd/led.o 00:02:45.890 CC lib/vmd/vmd.o 
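The [378/378] link step above completes the DPDK build; the CC/LIB/SO/SYMLINK lines that follow are SPDK's own make output. The "User defined options" block meson printed earlier in the log (buildtype, default_library, libdir, prefix, c_args, c_link_args, cpu_instruction_set, disable_apps, disable_libs, enable_docs, enable_drivers, enable_kmods, max_lcores, tests) is enough to reproduce that DPDK configuration by hand. A minimal shell sketch, assuming those dumped options map one-to-one onto meson -D flags; the paths are taken verbatim from this log, while the long comma-separated lists and the compiler-flag strings are abbreviated here and would need to be copied in full from the summary above:

    # Configure DPDK the way this job did: debug build, shared libraries,
    # and only the bus/mempool plus QAT, mlx5 and ipsec_mb crypto/compress
    # drivers enabled (see enable_drivers in the options summary above).
    cd /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build \
        -Dbuildtype=debug -Ddefault_library=shared -Dlibdir=lib \
        -Dcpu_instruction_set=native \
        -Ddisable_apps='test-dma-perf,test,...' \
        -Ddisable_libs='port,lpm,ipsec,...' \
        -Denable_drivers='bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,...' \
        -Denable_docs=false -Denable_kmods=false \
        -Dmax_lcores=128 -Dtests=false
    # Then run the backend command exactly as reported at 00:02:42.658:
    /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72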
00:02:45.890 CC lib/idxd/idxd.o 00:02:45.890 CC lib/rdma_provider/common.o 00:02:45.890 CC lib/idxd/idxd_user.o 00:02:45.890 CC lib/json/json_parse.o 00:02:45.890 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:45.890 CC lib/idxd/idxd_kernel.o 00:02:45.890 CC lib/json/json_util.o 00:02:45.890 CC lib/reduce/reduce.o 00:02:45.890 CC lib/json/json_write.o 00:02:46.149 LIB libspdk_rdma_provider.a 00:02:46.149 LIB libspdk_conf.a 00:02:46.149 SO libspdk_rdma_provider.so.6.0 00:02:46.149 LIB libspdk_json.a 00:02:46.149 SO libspdk_conf.so.6.0 00:02:46.149 LIB libspdk_rdma_utils.a 00:02:46.149 SO libspdk_json.so.6.0 00:02:46.149 SO libspdk_rdma_utils.so.1.0 00:02:46.407 SYMLINK libspdk_rdma_provider.so 00:02:46.407 SYMLINK libspdk_conf.so 00:02:46.407 SYMLINK libspdk_rdma_utils.so 00:02:46.407 SYMLINK libspdk_json.so 00:02:46.407 LIB libspdk_idxd.a 00:02:46.665 SO libspdk_idxd.so.12.0 00:02:46.665 LIB libspdk_vmd.a 00:02:46.665 LIB libspdk_reduce.a 00:02:46.665 SO libspdk_vmd.so.6.0 00:02:46.665 SO libspdk_reduce.so.6.0 00:02:46.665 SYMLINK libspdk_idxd.so 00:02:46.665 CC lib/jsonrpc/jsonrpc_server.o 00:02:46.665 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:46.665 CC lib/jsonrpc/jsonrpc_client.o 00:02:46.665 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:46.665 SYMLINK libspdk_reduce.so 00:02:46.665 SYMLINK libspdk_vmd.so 00:02:46.924 LIB libspdk_jsonrpc.a 00:02:46.924 SO libspdk_jsonrpc.so.6.0 00:02:47.181 SYMLINK libspdk_jsonrpc.so 00:02:47.440 CC lib/rpc/rpc.o 00:02:47.440 LIB libspdk_env_dpdk.a 00:02:47.699 SO libspdk_env_dpdk.so.14.1 00:02:47.699 LIB libspdk_rpc.a 00:02:47.699 SYMLINK libspdk_env_dpdk.so 00:02:47.699 SO libspdk_rpc.so.6.0 00:02:47.957 SYMLINK libspdk_rpc.so 00:02:48.215 CC lib/notify/notify.o 00:02:48.215 CC lib/notify/notify_rpc.o 00:02:48.215 CC lib/trace/trace.o 00:02:48.216 CC lib/trace/trace_flags.o 00:02:48.216 CC lib/trace/trace_rpc.o 00:02:48.216 CC lib/keyring/keyring.o 00:02:48.216 CC lib/keyring/keyring_rpc.o 00:02:48.474 LIB libspdk_notify.a 00:02:48.474 SO libspdk_notify.so.6.0 00:02:48.474 LIB libspdk_trace.a 00:02:48.474 LIB libspdk_keyring.a 00:02:48.474 SO libspdk_trace.so.10.0 00:02:48.474 SYMLINK libspdk_notify.so 00:02:48.474 SO libspdk_keyring.so.1.0 00:02:48.474 SYMLINK libspdk_trace.so 00:02:48.732 SYMLINK libspdk_keyring.so 00:02:48.990 CC lib/sock/sock.o 00:02:48.990 CC lib/sock/sock_rpc.o 00:02:48.990 CC lib/thread/thread.o 00:02:48.990 CC lib/thread/iobuf.o 00:02:49.248 LIB libspdk_sock.a 00:02:49.505 SO libspdk_sock.so.10.0 00:02:49.505 SYMLINK libspdk_sock.so 00:02:49.763 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:49.763 CC lib/nvme/nvme_ctrlr.o 00:02:49.763 CC lib/nvme/nvme_fabric.o 00:02:49.763 CC lib/nvme/nvme_ns.o 00:02:49.763 CC lib/nvme/nvme_ns_cmd.o 00:02:49.763 CC lib/nvme/nvme_pcie_common.o 00:02:49.763 CC lib/nvme/nvme_pcie.o 00:02:49.763 CC lib/nvme/nvme_qpair.o 00:02:49.763 CC lib/nvme/nvme.o 00:02:49.763 CC lib/nvme/nvme_quirks.o 00:02:49.763 CC lib/nvme/nvme_transport.o 00:02:49.763 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:49.763 CC lib/nvme/nvme_discovery.o 00:02:49.763 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:49.763 CC lib/nvme/nvme_tcp.o 00:02:49.763 CC lib/nvme/nvme_opal.o 00:02:49.763 CC lib/nvme/nvme_io_msg.o 00:02:49.763 CC lib/nvme/nvme_poll_group.o 00:02:49.763 CC lib/nvme/nvme_stubs.o 00:02:49.763 CC lib/nvme/nvme_zns.o 00:02:49.763 CC lib/nvme/nvme_cuse.o 00:02:49.763 CC lib/nvme/nvme_auth.o 00:02:49.763 CC lib/nvme/nvme_rdma.o 00:02:50.694 LIB libspdk_thread.a 00:02:50.694 SO libspdk_thread.so.10.1 00:02:50.694 SYMLINK libspdk_thread.so 
00:02:50.951 CC lib/accel/accel_rpc.o 00:02:50.951 CC lib/accel/accel.o 00:02:50.951 CC lib/accel/accel_sw.o 00:02:50.951 CC lib/virtio/virtio.o 00:02:50.951 CC lib/virtio/virtio_vhost_user.o 00:02:50.951 CC lib/virtio/virtio_vfio_user.o 00:02:50.951 CC lib/blob/blobstore.o 00:02:50.951 CC lib/blob/zeroes.o 00:02:50.951 CC lib/blob/blob_bs_dev.o 00:02:50.951 CC lib/blob/request.o 00:02:50.951 CC lib/virtio/virtio_pci.o 00:02:50.951 CC lib/init/subsystem.o 00:02:50.951 CC lib/init/json_config.o 00:02:50.951 CC lib/init/subsystem_rpc.o 00:02:50.951 CC lib/init/rpc.o 00:02:51.208 LIB libspdk_init.a 00:02:51.208 SO libspdk_init.so.5.0 00:02:51.209 LIB libspdk_virtio.a 00:02:51.467 SO libspdk_virtio.so.7.0 00:02:51.467 SYMLINK libspdk_init.so 00:02:51.467 SYMLINK libspdk_virtio.so 00:02:51.725 CC lib/event/reactor.o 00:02:51.725 CC lib/event/app.o 00:02:51.725 CC lib/event/app_rpc.o 00:02:51.725 CC lib/event/log_rpc.o 00:02:51.725 CC lib/event/scheduler_static.o 00:02:51.982 LIB libspdk_accel.a 00:02:51.982 SO libspdk_accel.so.15.1 00:02:51.982 LIB libspdk_nvme.a 00:02:51.982 SYMLINK libspdk_accel.so 00:02:52.238 LIB libspdk_event.a 00:02:52.238 SO libspdk_event.so.14.0 00:02:52.238 SO libspdk_nvme.so.13.1 00:02:52.238 SYMLINK libspdk_event.so 00:02:52.495 CC lib/bdev/bdev.o 00:02:52.495 CC lib/bdev/bdev_zone.o 00:02:52.495 CC lib/bdev/part.o 00:02:52.495 CC lib/bdev/bdev_rpc.o 00:02:52.495 CC lib/bdev/scsi_nvme.o 00:02:52.495 SYMLINK libspdk_nvme.so 00:02:53.867 LIB libspdk_blob.a 00:02:54.125 SO libspdk_blob.so.11.0 00:02:54.125 SYMLINK libspdk_blob.so 00:02:54.125 LIB libspdk_bdev.a 00:02:54.125 SO libspdk_bdev.so.15.1 00:02:54.389 SYMLINK libspdk_bdev.so 00:02:54.657 CC lib/lvol/lvol.o 00:02:54.657 CC lib/blobfs/blobfs.o 00:02:54.657 CC lib/blobfs/tree.o 00:02:54.657 CC lib/nvmf/ctrlr.o 00:02:54.657 CC lib/nvmf/ctrlr_discovery.o 00:02:54.657 CC lib/nvmf/ctrlr_bdev.o 00:02:54.657 CC lib/nvmf/subsystem.o 00:02:54.657 CC lib/scsi/dev.o 00:02:54.657 CC lib/nbd/nbd.o 00:02:54.657 CC lib/nvmf/nvmf.o 00:02:54.657 CC lib/nvmf/tcp.o 00:02:54.657 CC lib/nvmf/nvmf_rpc.o 00:02:54.657 CC lib/nbd/nbd_rpc.o 00:02:54.657 CC lib/nvmf/transport.o 00:02:54.657 CC lib/scsi/lun.o 00:02:54.657 CC lib/ublk/ublk_rpc.o 00:02:54.657 CC lib/ublk/ublk.o 00:02:54.657 CC lib/scsi/port.o 00:02:54.657 CC lib/nvmf/stubs.o 00:02:54.657 CC lib/scsi/scsi.o 00:02:54.657 CC lib/nvmf/mdns_server.o 00:02:54.657 CC lib/scsi/scsi_rpc.o 00:02:54.657 CC lib/nvmf/rdma.o 00:02:54.657 CC lib/scsi/scsi_bdev.o 00:02:54.657 CC lib/scsi/scsi_pr.o 00:02:54.657 CC lib/ftl/ftl_core.o 00:02:54.657 CC lib/nvmf/auth.o 00:02:54.657 CC lib/scsi/task.o 00:02:54.657 CC lib/ftl/ftl_init.o 00:02:54.657 CC lib/ftl/ftl_layout.o 00:02:54.657 CC lib/ftl/ftl_debug.o 00:02:54.657 CC lib/ftl/ftl_io.o 00:02:54.657 CC lib/ftl/ftl_sb.o 00:02:54.657 CC lib/ftl/ftl_l2p.o 00:02:54.657 CC lib/ftl/ftl_l2p_flat.o 00:02:54.657 CC lib/ftl/ftl_nv_cache.o 00:02:54.657 CC lib/ftl/ftl_band.o 00:02:54.657 CC lib/ftl/ftl_band_ops.o 00:02:54.657 CC lib/ftl/ftl_writer.o 00:02:54.657 CC lib/ftl/ftl_rq.o 00:02:54.657 CC lib/ftl/ftl_reloc.o 00:02:54.657 CC lib/ftl/ftl_l2p_cache.o 00:02:54.657 CC lib/ftl/ftl_p2l.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:54.657 CC 
lib/ftl/mngt/ftl_mngt_self_test.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:54.657 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:54.657 CC lib/ftl/utils/ftl_conf.o 00:02:54.657 CC lib/ftl/utils/ftl_md.o 00:02:54.657 CC lib/ftl/utils/ftl_mempool.o 00:02:54.657 CC lib/ftl/utils/ftl_property.o 00:02:54.657 CC lib/ftl/utils/ftl_bitmap.o 00:02:54.657 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:54.657 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:54.657 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:54.657 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:54.657 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:54.657 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:54.657 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:54.657 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:54.657 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:54.657 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:54.657 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:54.657 CC lib/ftl/base/ftl_base_dev.o 00:02:55.228 CC lib/ftl/base/ftl_base_bdev.o 00:02:55.228 CC lib/ftl/ftl_trace.o 00:02:55.487 LIB libspdk_nbd.a 00:02:55.487 LIB libspdk_scsi.a 00:02:55.487 SO libspdk_nbd.so.7.0 00:02:55.487 LIB libspdk_ublk.a 00:02:55.487 SO libspdk_scsi.so.9.0 00:02:55.487 SO libspdk_ublk.so.3.0 00:02:55.487 SYMLINK libspdk_nbd.so 00:02:55.745 SYMLINK libspdk_ublk.so 00:02:55.745 LIB libspdk_blobfs.a 00:02:55.745 SYMLINK libspdk_scsi.so 00:02:55.745 SO libspdk_blobfs.so.10.0 00:02:55.745 SYMLINK libspdk_blobfs.so 00:02:55.745 LIB libspdk_lvol.a 00:02:55.745 SO libspdk_lvol.so.10.0 00:02:56.004 SYMLINK libspdk_lvol.so 00:02:56.004 LIB libspdk_ftl.a 00:02:56.004 CC lib/iscsi/conn.o 00:02:56.004 CC lib/iscsi/init_grp.o 00:02:56.004 CC lib/iscsi/md5.o 00:02:56.004 CC lib/iscsi/param.o 00:02:56.004 CC lib/vhost/vhost.o 00:02:56.004 CC lib/iscsi/iscsi.o 00:02:56.004 CC lib/vhost/vhost_rpc.o 00:02:56.004 CC lib/vhost/vhost_blk.o 00:02:56.004 CC lib/vhost/vhost_scsi.o 00:02:56.004 CC lib/iscsi/portal_grp.o 00:02:56.004 CC lib/iscsi/tgt_node.o 00:02:56.004 CC lib/vhost/rte_vhost_user.o 00:02:56.004 CC lib/iscsi/iscsi_subsystem.o 00:02:56.004 CC lib/iscsi/iscsi_rpc.o 00:02:56.004 CC lib/iscsi/task.o 00:02:56.004 SO libspdk_ftl.so.9.0 00:02:56.569 SYMLINK libspdk_ftl.so 00:02:57.207 LIB libspdk_nvmf.a 00:02:57.207 LIB libspdk_vhost.a 00:02:57.207 SO libspdk_nvmf.so.18.1 00:02:57.207 SO libspdk_vhost.so.8.0 00:02:57.207 SYMLINK libspdk_vhost.so 00:02:57.465 SYMLINK libspdk_nvmf.so 00:02:57.465 LIB libspdk_iscsi.a 00:02:57.465 SO libspdk_iscsi.so.8.0 00:02:57.723 SYMLINK libspdk_iscsi.so 00:02:58.290 CC module/env_dpdk/env_dpdk_rpc.o 00:02:58.290 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:58.290 CC module/accel/iaa/accel_iaa.o 00:02:58.290 CC module/accel/iaa/accel_iaa_rpc.o 00:02:58.290 CC module/scheduler/gscheduler/gscheduler.o 00:02:58.290 CC module/accel/error/accel_error.o 00:02:58.290 CC module/accel/error/accel_error_rpc.o 00:02:58.290 CC module/sock/posix/posix.o 00:02:58.290 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:58.290 CC module/accel/dsa/accel_dsa.o 00:02:58.290 CC module/accel/dsa/accel_dsa_rpc.o 00:02:58.290 CC module/keyring/file/keyring.o 00:02:58.290 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:58.290 CC module/blob/bdev/blob_bdev.o 00:02:58.290 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:58.290 CC module/keyring/file/keyring_rpc.o 00:02:58.290 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:58.290 CC 
module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:58.290 CC module/accel/ioat/accel_ioat.o 00:02:58.290 CC module/accel/ioat/accel_ioat_rpc.o 00:02:58.290 CC module/keyring/linux/keyring_rpc.o 00:02:58.290 CC module/keyring/linux/keyring.o 00:02:58.290 LIB libspdk_env_dpdk_rpc.a 00:02:58.549 SO libspdk_env_dpdk_rpc.so.6.0 00:02:58.549 SYMLINK libspdk_env_dpdk_rpc.so 00:02:58.549 LIB libspdk_scheduler_gscheduler.a 00:02:58.549 LIB libspdk_scheduler_dpdk_governor.a 00:02:58.549 LIB libspdk_keyring_linux.a 00:02:58.549 LIB libspdk_scheduler_dynamic.a 00:02:58.549 SO libspdk_scheduler_gscheduler.so.4.0 00:02:58.549 LIB libspdk_accel_error.a 00:02:58.549 LIB libspdk_accel_iaa.a 00:02:58.549 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:58.549 LIB libspdk_accel_ioat.a 00:02:58.549 LIB libspdk_accel_dsa.a 00:02:58.549 SO libspdk_scheduler_dynamic.so.4.0 00:02:58.549 SO libspdk_keyring_linux.so.1.0 00:02:58.549 SO libspdk_accel_error.so.2.0 00:02:58.549 SO libspdk_accel_iaa.so.3.0 00:02:58.549 SYMLINK libspdk_scheduler_gscheduler.so 00:02:58.549 SO libspdk_accel_dsa.so.5.0 00:02:58.549 SO libspdk_accel_ioat.so.6.0 00:02:58.808 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:58.808 LIB libspdk_keyring_file.a 00:02:58.808 LIB libspdk_blob_bdev.a 00:02:58.808 SYMLINK libspdk_scheduler_dynamic.so 00:02:58.808 SYMLINK libspdk_keyring_linux.so 00:02:58.808 SYMLINK libspdk_accel_error.so 00:02:58.808 SYMLINK libspdk_accel_iaa.so 00:02:58.808 SYMLINK libspdk_accel_ioat.so 00:02:58.808 SO libspdk_keyring_file.so.1.0 00:02:58.808 SO libspdk_blob_bdev.so.11.0 00:02:58.808 SYMLINK libspdk_accel_dsa.so 00:02:58.808 SYMLINK libspdk_keyring_file.so 00:02:58.808 SYMLINK libspdk_blob_bdev.so 00:02:59.066 LIB libspdk_sock_posix.a 00:02:59.326 SO libspdk_sock_posix.so.6.0 00:02:59.326 CC module/blobfs/bdev/blobfs_bdev.o 00:02:59.326 CC module/bdev/crypto/vbdev_crypto.o 00:02:59.326 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:59.326 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:59.326 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:59.326 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:59.326 CC module/bdev/delay/vbdev_delay.o 00:02:59.326 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:59.326 CC module/bdev/lvol/vbdev_lvol.o 00:02:59.326 CC module/bdev/compress/vbdev_compress.o 00:02:59.326 CC module/bdev/iscsi/bdev_iscsi.o 00:02:59.326 SYMLINK libspdk_sock_posix.so 00:02:59.326 CC module/bdev/passthru/vbdev_passthru.o 00:02:59.326 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:59.326 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:59.326 CC module/bdev/raid/bdev_raid_rpc.o 00:02:59.326 CC module/bdev/raid/bdev_raid.o 00:02:59.326 CC module/bdev/error/vbdev_error.o 00:02:59.326 CC module/bdev/error/vbdev_error_rpc.o 00:02:59.326 CC module/bdev/raid/bdev_raid_sb.o 00:02:59.326 CC module/bdev/raid/concat.o 00:02:59.326 CC module/bdev/raid/raid1.o 00:02:59.326 CC module/bdev/raid/raid0.o 00:02:59.326 CC module/bdev/gpt/gpt.o 00:02:59.326 CC module/bdev/nvme/bdev_nvme.o 00:02:59.326 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:59.326 CC module/bdev/gpt/vbdev_gpt.o 00:02:59.326 CC module/bdev/nvme/nvme_rpc.o 00:02:59.326 CC module/bdev/nvme/bdev_mdns_client.o 00:02:59.326 CC module/bdev/malloc/bdev_malloc.o 00:02:59.326 CC module/bdev/nvme/vbdev_opal.o 00:02:59.326 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:59.326 CC module/bdev/ftl/bdev_ftl.o 00:02:59.326 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:59.326 CC module/bdev/aio/bdev_aio.o 00:02:59.326 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:59.326 CC 
module/bdev/aio/bdev_aio_rpc.o 00:02:59.326 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:59.326 CC module/bdev/split/vbdev_split.o 00:02:59.326 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:59.326 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:59.326 CC module/bdev/split/vbdev_split_rpc.o 00:02:59.326 CC module/bdev/null/bdev_null.o 00:02:59.326 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:59.326 CC module/bdev/null/bdev_null_rpc.o 00:02:59.326 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:59.326 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:59.585 LIB libspdk_accel_dpdk_cryptodev.a 00:02:59.585 LIB libspdk_accel_dpdk_compressdev.a 00:02:59.585 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:59.585 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:59.585 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:59.585 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:59.585 LIB libspdk_blobfs_bdev.a 00:02:59.585 LIB libspdk_bdev_gpt.a 00:02:59.585 SO libspdk_blobfs_bdev.so.6.0 00:02:59.585 LIB libspdk_bdev_passthru.a 00:02:59.585 SO libspdk_bdev_gpt.so.6.0 00:02:59.585 SO libspdk_bdev_passthru.so.6.0 00:02:59.843 LIB libspdk_bdev_split.a 00:02:59.843 SYMLINK libspdk_blobfs_bdev.so 00:02:59.843 LIB libspdk_bdev_error.a 00:02:59.843 LIB libspdk_bdev_null.a 00:02:59.843 LIB libspdk_bdev_compress.a 00:02:59.843 LIB libspdk_bdev_iscsi.a 00:02:59.843 LIB libspdk_bdev_delay.a 00:02:59.843 SYMLINK libspdk_bdev_gpt.so 00:02:59.843 SO libspdk_bdev_split.so.6.0 00:02:59.843 LIB libspdk_bdev_ftl.a 00:02:59.843 SYMLINK libspdk_bdev_passthru.so 00:02:59.843 SO libspdk_bdev_error.so.6.0 00:02:59.843 SO libspdk_bdev_iscsi.so.6.0 00:02:59.843 SO libspdk_bdev_delay.so.6.0 00:02:59.843 SO libspdk_bdev_null.so.6.0 00:02:59.843 SO libspdk_bdev_compress.so.6.0 00:02:59.843 LIB libspdk_bdev_crypto.a 00:02:59.843 SO libspdk_bdev_ftl.so.6.0 00:02:59.843 LIB libspdk_bdev_zone_block.a 00:02:59.843 LIB libspdk_bdev_aio.a 00:02:59.843 LIB libspdk_bdev_malloc.a 00:02:59.843 SYMLINK libspdk_bdev_split.so 00:02:59.843 SO libspdk_bdev_crypto.so.6.0 00:02:59.843 SO libspdk_bdev_aio.so.6.0 00:02:59.843 SO libspdk_bdev_zone_block.so.6.0 00:02:59.843 SYMLINK libspdk_bdev_iscsi.so 00:02:59.843 SO libspdk_bdev_malloc.so.6.0 00:02:59.843 SYMLINK libspdk_bdev_error.so 00:02:59.843 SYMLINK libspdk_bdev_null.so 00:02:59.843 SYMLINK libspdk_bdev_delay.so 00:02:59.843 SYMLINK libspdk_bdev_compress.so 00:02:59.843 SYMLINK libspdk_bdev_ftl.so 00:02:59.843 SYMLINK libspdk_bdev_aio.so 00:02:59.843 SYMLINK libspdk_bdev_zone_block.so 00:02:59.843 SYMLINK libspdk_bdev_crypto.so 00:02:59.843 SYMLINK libspdk_bdev_malloc.so 00:03:00.102 LIB libspdk_bdev_virtio.a 00:03:00.102 LIB libspdk_bdev_lvol.a 00:03:00.102 SO libspdk_bdev_virtio.so.6.0 00:03:00.102 SO libspdk_bdev_lvol.so.6.0 00:03:00.102 SYMLINK libspdk_bdev_virtio.so 00:03:00.102 SYMLINK libspdk_bdev_lvol.so 00:03:00.360 LIB libspdk_bdev_raid.a 00:03:00.360 SO libspdk_bdev_raid.so.6.0 00:03:00.360 SYMLINK libspdk_bdev_raid.so 00:03:01.740 LIB libspdk_bdev_nvme.a 00:03:01.740 SO libspdk_bdev_nvme.so.7.0 00:03:01.740 SYMLINK libspdk_bdev_nvme.so 00:03:02.676 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:02.676 CC module/event/subsystems/vmd/vmd.o 00:03:02.676 CC module/event/subsystems/scheduler/scheduler.o 00:03:02.676 CC module/event/subsystems/keyring/keyring.o 00:03:02.676 CC module/event/subsystems/sock/sock.o 00:03:02.676 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:02.676 CC module/event/subsystems/iobuf/iobuf.o 00:03:02.676 CC module/event/subsystems/iobuf/iobuf_rpc.o 
00:03:02.676 LIB libspdk_event_vhost_blk.a 00:03:02.676 LIB libspdk_event_sock.a 00:03:02.676 SO libspdk_event_vhost_blk.so.3.0 00:03:02.676 LIB libspdk_event_scheduler.a 00:03:02.676 LIB libspdk_event_vmd.a 00:03:02.676 LIB libspdk_event_keyring.a 00:03:02.676 LIB libspdk_event_iobuf.a 00:03:02.676 SO libspdk_event_scheduler.so.4.0 00:03:02.676 SO libspdk_event_sock.so.5.0 00:03:02.676 SO libspdk_event_keyring.so.1.0 00:03:02.676 SO libspdk_event_vmd.so.6.0 00:03:02.676 SO libspdk_event_iobuf.so.3.0 00:03:02.676 SYMLINK libspdk_event_vhost_blk.so 00:03:02.676 SYMLINK libspdk_event_scheduler.so 00:03:02.676 SYMLINK libspdk_event_sock.so 00:03:02.676 SYMLINK libspdk_event_keyring.so 00:03:02.676 SYMLINK libspdk_event_vmd.so 00:03:02.676 SYMLINK libspdk_event_iobuf.so 00:03:03.242 CC module/event/subsystems/accel/accel.o 00:03:03.242 LIB libspdk_event_accel.a 00:03:03.500 SO libspdk_event_accel.so.6.0 00:03:03.500 SYMLINK libspdk_event_accel.so 00:03:03.758 CC module/event/subsystems/bdev/bdev.o 00:03:04.016 LIB libspdk_event_bdev.a 00:03:04.016 SO libspdk_event_bdev.so.6.0 00:03:04.016 SYMLINK libspdk_event_bdev.so 00:03:04.582 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:04.582 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:04.582 CC module/event/subsystems/nbd/nbd.o 00:03:04.582 CC module/event/subsystems/scsi/scsi.o 00:03:04.582 CC module/event/subsystems/ublk/ublk.o 00:03:04.582 LIB libspdk_event_nbd.a 00:03:04.582 LIB libspdk_event_scsi.a 00:03:04.582 LIB libspdk_event_ublk.a 00:03:04.582 SO libspdk_event_nbd.so.6.0 00:03:04.839 SO libspdk_event_scsi.so.6.0 00:03:04.839 SO libspdk_event_ublk.so.3.0 00:03:04.839 LIB libspdk_event_nvmf.a 00:03:04.840 SYMLINK libspdk_event_nbd.so 00:03:04.840 SYMLINK libspdk_event_scsi.so 00:03:04.840 SO libspdk_event_nvmf.so.6.0 00:03:04.840 SYMLINK libspdk_event_ublk.so 00:03:04.840 SYMLINK libspdk_event_nvmf.so 00:03:05.097 CC module/event/subsystems/iscsi/iscsi.o 00:03:05.097 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:05.356 LIB libspdk_event_vhost_scsi.a 00:03:05.356 LIB libspdk_event_iscsi.a 00:03:05.356 SO libspdk_event_vhost_scsi.so.3.0 00:03:05.356 SO libspdk_event_iscsi.so.6.0 00:03:05.356 SYMLINK libspdk_event_vhost_scsi.so 00:03:05.356 SYMLINK libspdk_event_iscsi.so 00:03:05.615 SO libspdk.so.6.0 00:03:05.615 SYMLINK libspdk.so 00:03:05.873 TEST_HEADER include/spdk/assert.h 00:03:05.873 TEST_HEADER include/spdk/accel_module.h 00:03:05.873 TEST_HEADER include/spdk/accel.h 00:03:05.873 TEST_HEADER include/spdk/barrier.h 00:03:05.873 TEST_HEADER include/spdk/base64.h 00:03:05.873 TEST_HEADER include/spdk/bdev.h 00:03:05.873 TEST_HEADER include/spdk/bdev_module.h 00:03:05.873 TEST_HEADER include/spdk/bit_array.h 00:03:05.873 TEST_HEADER include/spdk/bdev_zone.h 00:03:05.873 TEST_HEADER include/spdk/bit_pool.h 00:03:05.873 TEST_HEADER include/spdk/blob_bdev.h 00:03:05.873 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:05.873 TEST_HEADER include/spdk/blobfs.h 00:03:05.873 TEST_HEADER include/spdk/blob.h 00:03:05.873 TEST_HEADER include/spdk/conf.h 00:03:05.873 CXX app/trace/trace.o 00:03:05.873 TEST_HEADER include/spdk/cpuset.h 00:03:05.873 TEST_HEADER include/spdk/crc16.h 00:03:05.873 TEST_HEADER include/spdk/crc32.h 00:03:05.873 TEST_HEADER include/spdk/config.h 00:03:05.873 TEST_HEADER include/spdk/crc64.h 00:03:05.873 TEST_HEADER include/spdk/dif.h 00:03:05.873 TEST_HEADER include/spdk/endian.h 00:03:05.873 TEST_HEADER include/spdk/env_dpdk.h 00:03:05.873 TEST_HEADER include/spdk/dma.h 00:03:05.873 CC 
test/rpc_client/rpc_client_test.o 00:03:05.873 CC app/trace_record/trace_record.o 00:03:05.873 TEST_HEADER include/spdk/env.h 00:03:05.873 TEST_HEADER include/spdk/event.h 00:03:05.873 TEST_HEADER include/spdk/fd_group.h 00:03:05.873 TEST_HEADER include/spdk/fd.h 00:03:05.873 TEST_HEADER include/spdk/file.h 00:03:05.873 TEST_HEADER include/spdk/ftl.h 00:03:05.873 TEST_HEADER include/spdk/gpt_spec.h 00:03:05.873 CC app/spdk_top/spdk_top.o 00:03:05.873 TEST_HEADER include/spdk/hexlify.h 00:03:05.873 TEST_HEADER include/spdk/histogram_data.h 00:03:05.873 TEST_HEADER include/spdk/idxd.h 00:03:05.873 TEST_HEADER include/spdk/idxd_spec.h 00:03:05.873 CC app/spdk_lspci/spdk_lspci.o 00:03:05.873 TEST_HEADER include/spdk/ioat.h 00:03:05.873 TEST_HEADER include/spdk/init.h 00:03:05.873 CC app/spdk_nvme_identify/identify.o 00:03:05.873 TEST_HEADER include/spdk/ioat_spec.h 00:03:05.873 TEST_HEADER include/spdk/iscsi_spec.h 00:03:05.873 TEST_HEADER include/spdk/json.h 00:03:05.873 TEST_HEADER include/spdk/jsonrpc.h 00:03:05.873 TEST_HEADER include/spdk/keyring.h 00:03:05.873 TEST_HEADER include/spdk/keyring_module.h 00:03:05.873 TEST_HEADER include/spdk/likely.h 00:03:05.873 TEST_HEADER include/spdk/log.h 00:03:05.873 TEST_HEADER include/spdk/lvol.h 00:03:05.873 TEST_HEADER include/spdk/memory.h 00:03:05.873 TEST_HEADER include/spdk/mmio.h 00:03:05.873 TEST_HEADER include/spdk/nbd.h 00:03:05.873 TEST_HEADER include/spdk/notify.h 00:03:05.873 TEST_HEADER include/spdk/nvme.h 00:03:05.873 TEST_HEADER include/spdk/nvme_intel.h 00:03:05.873 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:05.873 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:05.873 TEST_HEADER include/spdk/nvme_spec.h 00:03:05.873 CC app/spdk_nvme_perf/perf.o 00:03:05.873 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:05.873 TEST_HEADER include/spdk/nvme_zns.h 00:03:05.873 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:05.873 TEST_HEADER include/spdk/nvmf.h 00:03:06.143 TEST_HEADER include/spdk/nvmf_spec.h 00:03:06.143 TEST_HEADER include/spdk/nvmf_transport.h 00:03:06.143 CC app/spdk_nvme_discover/discovery_aer.o 00:03:06.143 TEST_HEADER include/spdk/opal.h 00:03:06.143 TEST_HEADER include/spdk/opal_spec.h 00:03:06.143 TEST_HEADER include/spdk/pci_ids.h 00:03:06.143 TEST_HEADER include/spdk/queue.h 00:03:06.143 TEST_HEADER include/spdk/pipe.h 00:03:06.143 TEST_HEADER include/spdk/reduce.h 00:03:06.143 TEST_HEADER include/spdk/rpc.h 00:03:06.143 TEST_HEADER include/spdk/scsi.h 00:03:06.143 TEST_HEADER include/spdk/scheduler.h 00:03:06.143 TEST_HEADER include/spdk/scsi_spec.h 00:03:06.143 TEST_HEADER include/spdk/sock.h 00:03:06.143 TEST_HEADER include/spdk/stdinc.h 00:03:06.143 TEST_HEADER include/spdk/string.h 00:03:06.143 TEST_HEADER include/spdk/thread.h 00:03:06.143 TEST_HEADER include/spdk/trace.h 00:03:06.143 TEST_HEADER include/spdk/trace_parser.h 00:03:06.143 TEST_HEADER include/spdk/tree.h 00:03:06.143 TEST_HEADER include/spdk/ublk.h 00:03:06.143 TEST_HEADER include/spdk/util.h 00:03:06.143 TEST_HEADER include/spdk/uuid.h 00:03:06.143 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:06.143 TEST_HEADER include/spdk/version.h 00:03:06.143 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:06.143 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:06.143 TEST_HEADER include/spdk/vhost.h 00:03:06.143 TEST_HEADER include/spdk/vmd.h 00:03:06.143 TEST_HEADER include/spdk/xor.h 00:03:06.143 TEST_HEADER include/spdk/zipf.h 00:03:06.143 CXX test/cpp_headers/accel_module.o 00:03:06.143 CXX test/cpp_headers/accel.o 00:03:06.143 CXX test/cpp_headers/barrier.o 
00:03:06.143 CXX test/cpp_headers/assert.o 00:03:06.143 CXX test/cpp_headers/bdev.o 00:03:06.143 CXX test/cpp_headers/base64.o 00:03:06.143 CXX test/cpp_headers/bdev_module.o 00:03:06.143 CXX test/cpp_headers/bdev_zone.o 00:03:06.143 CXX test/cpp_headers/bit_pool.o 00:03:06.143 CXX test/cpp_headers/bit_array.o 00:03:06.143 CXX test/cpp_headers/blob_bdev.o 00:03:06.143 CXX test/cpp_headers/blobfs_bdev.o 00:03:06.143 CXX test/cpp_headers/blobfs.o 00:03:06.143 CXX test/cpp_headers/blob.o 00:03:06.143 CXX test/cpp_headers/conf.o 00:03:06.143 CXX test/cpp_headers/config.o 00:03:06.143 CC app/spdk_dd/spdk_dd.o 00:03:06.143 CXX test/cpp_headers/cpuset.o 00:03:06.143 CXX test/cpp_headers/crc16.o 00:03:06.143 CXX test/cpp_headers/crc32.o 00:03:06.143 CXX test/cpp_headers/crc64.o 00:03:06.143 CXX test/cpp_headers/dif.o 00:03:06.143 CXX test/cpp_headers/dma.o 00:03:06.143 CXX test/cpp_headers/endian.o 00:03:06.143 CXX test/cpp_headers/env_dpdk.o 00:03:06.143 CXX test/cpp_headers/env.o 00:03:06.143 CXX test/cpp_headers/event.o 00:03:06.143 CXX test/cpp_headers/fd_group.o 00:03:06.143 CC app/iscsi_tgt/iscsi_tgt.o 00:03:06.143 CXX test/cpp_headers/fd.o 00:03:06.143 CXX test/cpp_headers/file.o 00:03:06.143 CXX test/cpp_headers/ftl.o 00:03:06.143 CXX test/cpp_headers/gpt_spec.o 00:03:06.143 CXX test/cpp_headers/hexlify.o 00:03:06.143 CXX test/cpp_headers/histogram_data.o 00:03:06.143 CXX test/cpp_headers/idxd.o 00:03:06.143 CXX test/cpp_headers/idxd_spec.o 00:03:06.143 CXX test/cpp_headers/ioat.o 00:03:06.143 CXX test/cpp_headers/init.o 00:03:06.143 CXX test/cpp_headers/ioat_spec.o 00:03:06.143 CXX test/cpp_headers/jsonrpc.o 00:03:06.143 CXX test/cpp_headers/json.o 00:03:06.143 CXX test/cpp_headers/iscsi_spec.o 00:03:06.143 CXX test/cpp_headers/keyring.o 00:03:06.143 CC app/nvmf_tgt/nvmf_main.o 00:03:06.143 CC test/app/jsoncat/jsoncat.o 00:03:06.143 CC app/spdk_tgt/spdk_tgt.o 00:03:06.143 CXX test/cpp_headers/keyring_module.o 00:03:06.143 CC test/app/histogram_perf/histogram_perf.o 00:03:06.143 CC test/env/vtophys/vtophys.o 00:03:06.143 CC test/env/memory/memory_ut.o 00:03:06.143 CC test/app/stub/stub.o 00:03:06.143 CC test/thread/poller_perf/poller_perf.o 00:03:06.143 CC examples/ioat/perf/perf.o 00:03:06.143 CC examples/util/zipf/zipf.o 00:03:06.143 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:06.143 CC test/env/pci/pci_ut.o 00:03:06.143 CC examples/ioat/verify/verify.o 00:03:06.143 CC test/app/bdev_svc/bdev_svc.o 00:03:06.143 CC test/dma/test_dma/test_dma.o 00:03:06.143 CC app/fio/nvme/fio_plugin.o 00:03:06.402 CC app/fio/bdev/fio_plugin.o 00:03:06.402 LINK rpc_client_test 00:03:06.402 LINK spdk_lspci 00:03:06.402 LINK spdk_trace_record 00:03:06.402 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:06.402 LINK spdk_nvme_discover 00:03:06.402 CC test/env/mem_callbacks/mem_callbacks.o 00:03:06.402 LINK jsoncat 00:03:06.402 LINK interrupt_tgt 00:03:06.669 LINK poller_perf 00:03:06.669 CXX test/cpp_headers/likely.o 00:03:06.669 CXX test/cpp_headers/log.o 00:03:06.669 LINK vtophys 00:03:06.669 LINK iscsi_tgt 00:03:06.669 CXX test/cpp_headers/lvol.o 00:03:06.669 LINK ioat_perf 00:03:06.669 LINK nvmf_tgt 00:03:06.669 CXX test/cpp_headers/memory.o 00:03:06.669 LINK env_dpdk_post_init 00:03:06.669 CXX test/cpp_headers/mmio.o 00:03:06.669 CXX test/cpp_headers/nbd.o 00:03:06.669 CXX test/cpp_headers/notify.o 00:03:06.669 CXX test/cpp_headers/nvme.o 00:03:06.669 CXX test/cpp_headers/nvme_intel.o 00:03:06.669 CXX test/cpp_headers/nvme_ocssd.o 00:03:06.669 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:06.669 
CXX test/cpp_headers/nvme_spec.o 00:03:06.669 LINK verify 00:03:06.669 LINK zipf 00:03:06.669 CXX test/cpp_headers/nvme_zns.o 00:03:06.669 LINK stub 00:03:06.669 LINK histogram_perf 00:03:06.669 CXX test/cpp_headers/nvmf_cmd.o 00:03:06.669 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:06.669 CXX test/cpp_headers/nvmf.o 00:03:06.669 CXX test/cpp_headers/nvmf_spec.o 00:03:06.669 CXX test/cpp_headers/nvmf_transport.o 00:03:06.669 CXX test/cpp_headers/opal.o 00:03:06.669 LINK bdev_svc 00:03:06.669 CXX test/cpp_headers/opal_spec.o 00:03:06.669 CXX test/cpp_headers/pci_ids.o 00:03:06.669 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:06.669 CXX test/cpp_headers/pipe.o 00:03:06.669 CXX test/cpp_headers/queue.o 00:03:06.669 CXX test/cpp_headers/reduce.o 00:03:06.669 CXX test/cpp_headers/rpc.o 00:03:06.669 CXX test/cpp_headers/scheduler.o 00:03:06.669 CXX test/cpp_headers/scsi.o 00:03:06.669 CXX test/cpp_headers/scsi_spec.o 00:03:06.669 CXX test/cpp_headers/sock.o 00:03:06.669 CXX test/cpp_headers/stdinc.o 00:03:06.669 CXX test/cpp_headers/string.o 00:03:06.669 CXX test/cpp_headers/thread.o 00:03:06.669 CXX test/cpp_headers/trace.o 00:03:06.669 CXX test/cpp_headers/trace_parser.o 00:03:06.669 CXX test/cpp_headers/tree.o 00:03:06.669 CXX test/cpp_headers/ublk.o 00:03:06.669 LINK spdk_tgt 00:03:06.669 CXX test/cpp_headers/util.o 00:03:06.669 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:06.928 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:06.928 CXX test/cpp_headers/uuid.o 00:03:06.928 CXX test/cpp_headers/version.o 00:03:06.928 CXX test/cpp_headers/vfio_user_pci.o 00:03:06.928 CXX test/cpp_headers/vfio_user_spec.o 00:03:06.928 CXX test/cpp_headers/vhost.o 00:03:06.928 CXX test/cpp_headers/vmd.o 00:03:06.928 CXX test/cpp_headers/xor.o 00:03:06.928 CXX test/cpp_headers/zipf.o 00:03:06.928 LINK test_dma 00:03:06.928 LINK spdk_dd 00:03:07.187 LINK pci_ut 00:03:07.187 LINK nvme_fuzz 00:03:07.187 CC test/event/reactor/reactor.o 00:03:07.187 CC test/event/event_perf/event_perf.o 00:03:07.187 CC test/event/reactor_perf/reactor_perf.o 00:03:07.187 CC test/event/app_repeat/app_repeat.o 00:03:07.187 LINK spdk_nvme_perf 00:03:07.187 CC examples/sock/hello_world/hello_sock.o 00:03:07.187 CC examples/vmd/lsvmd/lsvmd.o 00:03:07.187 CC test/event/scheduler/scheduler.o 00:03:07.187 LINK spdk_nvme 00:03:07.187 CC examples/vmd/led/led.o 00:03:07.187 CC examples/idxd/perf/perf.o 00:03:07.187 LINK mem_callbacks 00:03:07.446 CC examples/thread/thread/thread_ex.o 00:03:07.446 LINK vhost_fuzz 00:03:07.446 LINK spdk_bdev 00:03:07.446 LINK event_perf 00:03:07.446 LINK reactor 00:03:07.446 LINK reactor_perf 00:03:07.446 LINK spdk_top 00:03:07.446 LINK app_repeat 00:03:07.446 LINK lsvmd 00:03:07.446 LINK led 00:03:07.446 LINK spdk_nvme_identify 00:03:07.446 CC test/nvme/reset/reset.o 00:03:07.446 LINK hello_sock 00:03:07.446 CC test/nvme/e2edp/nvme_dp.o 00:03:07.446 CC test/nvme/sgl/sgl.o 00:03:07.446 CC test/nvme/err_injection/err_injection.o 00:03:07.446 CC test/nvme/cuse/cuse.o 00:03:07.446 CC test/nvme/overhead/overhead.o 00:03:07.446 CC test/nvme/boot_partition/boot_partition.o 00:03:07.446 CC test/nvme/fdp/fdp.o 00:03:07.446 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:07.446 CC test/nvme/startup/startup.o 00:03:07.446 CC test/nvme/simple_copy/simple_copy.o 00:03:07.446 CC test/nvme/compliance/nvme_compliance.o 00:03:07.446 CC test/nvme/fused_ordering/fused_ordering.o 00:03:07.446 CC test/nvme/aer/aer.o 00:03:07.446 CC test/nvme/reserve/reserve.o 00:03:07.446 CC test/nvme/connect_stress/connect_stress.o 
00:03:07.446 CC test/accel/dif/dif.o 00:03:07.704 CC test/blobfs/mkfs/mkfs.o 00:03:07.704 LINK scheduler 00:03:07.704 LINK spdk_trace 00:03:07.704 CC test/lvol/esnap/esnap.o 00:03:07.704 LINK thread 00:03:07.704 LINK idxd_perf 00:03:07.704 LINK boot_partition 00:03:07.704 LINK fused_ordering 00:03:07.704 LINK reserve 00:03:07.704 LINK startup 00:03:07.704 LINK simple_copy 00:03:07.704 LINK connect_stress 00:03:07.704 LINK memory_ut 00:03:07.962 LINK doorbell_aers 00:03:07.962 LINK err_injection 00:03:07.962 LINK mkfs 00:03:07.962 LINK overhead 00:03:07.962 LINK nvme_dp 00:03:07.962 LINK reset 00:03:07.962 LINK sgl 00:03:07.962 LINK aer 00:03:07.962 LINK nvme_compliance 00:03:07.962 LINK fdp 00:03:07.962 CC examples/nvme/reconnect/reconnect.o 00:03:07.962 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:07.962 CC examples/nvme/abort/abort.o 00:03:07.962 CC examples/nvme/hotplug/hotplug.o 00:03:07.962 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:07.962 CC examples/nvme/arbitration/arbitration.o 00:03:07.962 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:07.962 CC examples/nvme/hello_world/hello_world.o 00:03:07.962 CC app/vhost/vhost.o 00:03:07.962 LINK dif 00:03:08.220 LINK pmr_persistence 00:03:08.220 LINK hotplug 00:03:08.220 LINK cmb_copy 00:03:08.220 LINK vhost 00:03:08.220 LINK hello_world 00:03:08.220 CC examples/blob/hello_world/hello_blob.o 00:03:08.220 CC examples/accel/perf/accel_perf.o 00:03:08.220 CC examples/blob/cli/blobcli.o 00:03:08.478 LINK reconnect 00:03:08.478 LINK arbitration 00:03:08.478 LINK abort 00:03:08.478 LINK nvme_manage 00:03:08.478 LINK hello_blob 00:03:08.736 LINK iscsi_fuzz 00:03:08.736 CC test/bdev/bdevio/bdevio.o 00:03:08.736 LINK accel_perf 00:03:08.994 LINK blobcli 00:03:08.994 LINK cuse 00:03:09.252 LINK bdevio 00:03:09.510 CC examples/bdev/bdevperf/bdevperf.o 00:03:09.510 CC examples/bdev/hello_world/hello_bdev.o 00:03:09.767 LINK hello_bdev 00:03:10.331 LINK bdevperf 00:03:10.927 CC examples/nvmf/nvmf/nvmf.o 00:03:11.191 LINK nvmf 00:03:12.561 LINK esnap 00:03:13.124 00:03:13.124 real 1m30.784s 00:03:13.124 user 17m23.874s 00:03:13.124 sys 4m12.231s 00:03:13.124 09:07:21 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:03:13.124 09:07:21 make -- common/autotest_common.sh@10 -- $ set +x 00:03:13.124 ************************************ 00:03:13.124 END TEST make 00:03:13.124 ************************************ 00:03:13.124 09:07:22 -- common/autotest_common.sh@1142 -- $ return 0 00:03:13.124 09:07:22 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:13.124 09:07:22 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:13.124 09:07:22 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:13.124 09:07:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.124 09:07:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:13.124 09:07:22 -- pm/common@44 -- $ pid=4121112 00:03:13.124 09:07:22 -- pm/common@50 -- $ kill -TERM 4121112 00:03:13.124 09:07:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.124 09:07:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:13.124 09:07:22 -- pm/common@44 -- $ pid=4121113 00:03:13.124 09:07:22 -- pm/common@50 -- $ kill -TERM 4121113 00:03:13.124 09:07:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.124 09:07:22 -- pm/common@43 -- $ [[ -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:13.124 09:07:22 -- pm/common@44 -- $ pid=4121115 00:03:13.124 09:07:22 -- pm/common@50 -- $ kill -TERM 4121115 00:03:13.124 09:07:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.124 09:07:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:13.124 09:07:22 -- pm/common@44 -- $ pid=4121138 00:03:13.124 09:07:22 -- pm/common@50 -- $ sudo -E kill -TERM 4121138 00:03:13.381 09:07:22 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:13.381 09:07:22 -- nvmf/common.sh@7 -- # uname -s 00:03:13.381 09:07:22 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:13.381 09:07:22 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:13.381 09:07:22 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:13.381 09:07:22 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:13.381 09:07:22 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:13.381 09:07:22 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:13.381 09:07:22 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:13.381 09:07:22 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:13.381 09:07:22 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:13.381 09:07:22 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:13.381 09:07:22 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:03:13.381 09:07:22 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:03:13.381 09:07:22 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:13.381 09:07:22 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:13.381 09:07:22 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:13.381 09:07:22 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:13.381 09:07:22 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:13.381 09:07:22 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:13.381 09:07:22 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:13.381 09:07:22 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:13.381 09:07:22 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:13.381 09:07:22 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:13.381 09:07:22 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:13.381 09:07:22 -- paths/export.sh@5 -- # export PATH 00:03:13.381 09:07:22 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:13.381 09:07:22 -- nvmf/common.sh@47 -- # : 0 00:03:13.381 09:07:22 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:13.381 09:07:22 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:13.381 09:07:22 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:13.381 09:07:22 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:13.381 09:07:22 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:13.381 09:07:22 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:13.382 09:07:22 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:13.382 09:07:22 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:13.382 09:07:22 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:13.382 09:07:22 -- spdk/autotest.sh@32 -- # uname -s 00:03:13.382 09:07:22 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:13.382 09:07:22 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:13.382 09:07:22 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:13.382 09:07:22 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:13.382 09:07:22 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:13.382 09:07:22 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:13.382 09:07:22 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:13.382 09:07:22 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:13.382 09:07:22 -- spdk/autotest.sh@48 -- # udevadm_pid=4187939 00:03:13.382 09:07:22 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:13.382 09:07:22 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:13.382 09:07:22 -- pm/common@17 -- # local monitor 00:03:13.382 09:07:22 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.382 09:07:22 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.382 09:07:22 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.382 09:07:22 -- pm/common@21 -- # date +%s 00:03:13.382 09:07:22 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.382 09:07:22 -- pm/common@21 -- # date +%s 00:03:13.382 09:07:22 -- pm/common@25 -- # sleep 1 00:03:13.382 09:07:22 -- pm/common@21 -- # date +%s 00:03:13.382 09:07:22 -- pm/common@21 -- # date +%s 00:03:13.382 09:07:22 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721027242 00:03:13.382 09:07:22 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721027242 00:03:13.382 09:07:22 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721027242 00:03:13.382 09:07:22 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721027242 00:03:13.382 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721027242_collect-vmstat.pm.log 00:03:13.382 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721027242_collect-cpu-load.pm.log 00:03:13.382 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721027242_collect-cpu-temp.pm.log 00:03:13.382 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721027242_collect-bmc-pm.bmc.pm.log 00:03:14.314 09:07:23 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:14.314 09:07:23 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:14.314 09:07:23 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:14.314 09:07:23 -- common/autotest_common.sh@10 -- # set +x 00:03:14.314 09:07:23 -- spdk/autotest.sh@59 -- # create_test_list 00:03:14.314 09:07:23 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:14.314 09:07:23 -- common/autotest_common.sh@10 -- # set +x 00:03:14.314 09:07:23 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:14.571 09:07:23 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:14.571 09:07:23 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:14.571 09:07:23 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:14.571 09:07:23 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:14.571 09:07:23 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:14.571 09:07:23 -- common/autotest_common.sh@1455 -- # uname 00:03:14.571 09:07:23 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:14.571 09:07:23 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:14.571 09:07:23 -- common/autotest_common.sh@1475 -- # uname 00:03:14.571 09:07:23 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:14.571 09:07:23 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:14.571 09:07:23 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:14.571 09:07:23 -- spdk/autotest.sh@72 -- # hash lcov 00:03:14.571 09:07:23 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:14.571 09:07:23 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:14.571 --rc lcov_branch_coverage=1 00:03:14.571 --rc lcov_function_coverage=1 00:03:14.571 --rc genhtml_branch_coverage=1 00:03:14.571 --rc genhtml_function_coverage=1 00:03:14.571 --rc genhtml_legend=1 00:03:14.571 --rc geninfo_all_blocks=1 00:03:14.571 ' 00:03:14.571 09:07:23 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:14.571 --rc lcov_branch_coverage=1 00:03:14.571 --rc lcov_function_coverage=1 00:03:14.571 --rc genhtml_branch_coverage=1 00:03:14.571 --rc genhtml_function_coverage=1 00:03:14.571 --rc genhtml_legend=1 00:03:14.571 --rc geninfo_all_blocks=1 00:03:14.571 ' 00:03:14.571 09:07:23 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:14.571 --rc lcov_branch_coverage=1 00:03:14.571 --rc lcov_function_coverage=1 00:03:14.571 --rc genhtml_branch_coverage=1 00:03:14.571 --rc genhtml_function_coverage=1 00:03:14.571 --rc genhtml_legend=1 00:03:14.571 --rc geninfo_all_blocks=1 00:03:14.571 --no-external' 00:03:14.571 09:07:23 -- spdk/autotest.sh@81 -- # 
LCOV='lcov 00:03:14.571 --rc lcov_branch_coverage=1 00:03:14.571 --rc lcov_function_coverage=1 00:03:14.571 --rc genhtml_branch_coverage=1 00:03:14.571 --rc genhtml_function_coverage=1 00:03:14.571 --rc genhtml_legend=1 00:03:14.571 --rc geninfo_all_blocks=1 00:03:14.571 --no-external' 00:03:14.571 09:07:23 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:14.571 lcov: LCOV version 1.14 00:03:14.571 09:07:23 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:19.854 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:19.854 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:19.854 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:20.112 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:20.112 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:20.112 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:20.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:20.113 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:20.113 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:20.113 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:20.113 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:20.113 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:20.113 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:20.113 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:20.113 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:20.113 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:20.113 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:20.113 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:20.113 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:20.113 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:20.113 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:20.113 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:20.371 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:20.371 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:20.371 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:20.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:20.630 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:42.548 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:42.548 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:49.115 09:07:58 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:49.115 09:07:58 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:49.115 09:07:58 -- common/autotest_common.sh@10 -- # set +x 00:03:49.115 09:07:58 -- spdk/autotest.sh@91 -- # rm -f 00:03:49.115 09:07:58 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:53.344 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:53.344 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:53.344 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:03:53.344 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:00:04.0 (8086 2021): Already using the ioatdma 
driver 00:03:53.344 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:53.344 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:53.344 09:08:02 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:53.344 09:08:02 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:53.344 09:08:02 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:53.344 09:08:02 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:53.345 09:08:02 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:53.345 09:08:02 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:53.345 09:08:02 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:53.345 09:08:02 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:53.345 09:08:02 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:53.345 09:08:02 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:53.345 09:08:02 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:53.345 09:08:02 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:53.345 09:08:02 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:53.345 09:08:02 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:53.345 09:08:02 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:53.345 No valid GPT data, bailing 00:03:53.345 09:08:02 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:53.345 09:08:02 -- scripts/common.sh@391 -- # pt= 00:03:53.345 09:08:02 -- scripts/common.sh@392 -- # return 1 00:03:53.345 09:08:02 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:53.345 1+0 records in 00:03:53.345 1+0 records out 00:03:53.345 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00468772 s, 224 MB/s 00:03:53.345 09:08:02 -- spdk/autotest.sh@118 -- # sync 00:03:53.345 09:08:02 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:53.345 09:08:02 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:53.345 09:08:02 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:58.607 09:08:06 -- spdk/autotest.sh@124 -- # uname -s 00:03:58.607 09:08:06 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:58.607 09:08:06 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:58.607 09:08:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:58.607 09:08:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:58.607 09:08:06 -- common/autotest_common.sh@10 -- # set +x 00:03:58.607 ************************************ 00:03:58.607 START TEST setup.sh 00:03:58.607 ************************************ 00:03:58.607 09:08:06 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:58.607 * Looking for test storage... 
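Before the setup tests begin, the trace above shows autotest preparing the scratch NVMe namespace: scripts/spdk-gpt.py and blkid find no partition table on /dev/nvme0n1 ("No valid GPT data, bailing", pt comes back empty), so the first megabyte is zeroed with dd and the caches are synced. A minimal stand-alone sketch of that wipe-if-unused step, assuming /dev/nvme0n1 is a disposable test device (not the exact autotest implementation), would be:

    #!/usr/bin/env bash
    # Sketch of the sequence traced above; run only against scratch devices.
    dev=/dev/nvme0n1                        # assumption: disposable NVMe namespace
    pt=$(blkid -s PTTYPE -o value "$dev")   # empty output means no partition table found
    if [[ -z "$pt" ]]; then
        dd if=/dev/zero of="$dev" bs=1M count=1   # clear stale metadata in the first 1 MiB
        sync
    fi

Guarding the dd behind the partition-table probe is what keeps the wipe from touching a device that is actually in use, which is the behavior the "No valid GPT data, bailing ... return 1" lines above reflect.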
00:03:58.607 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:58.607 09:08:06 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:58.607 09:08:06 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:58.607 09:08:06 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:58.607 09:08:06 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:58.607 09:08:06 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:58.607 09:08:06 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:58.607 ************************************ 00:03:58.607 START TEST acl 00:03:58.607 ************************************ 00:03:58.607 09:08:07 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:58.607 * Looking for test storage... 00:03:58.607 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:58.607 09:08:07 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:58.607 09:08:07 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:58.607 09:08:07 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:58.607 09:08:07 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:58.607 09:08:07 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:58.607 09:08:07 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:58.607 09:08:07 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:58.607 09:08:07 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:58.607 09:08:07 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:58.607 09:08:07 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:58.607 09:08:07 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:58.607 09:08:07 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:58.607 09:08:07 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:58.607 09:08:07 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:58.607 09:08:07 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:58.607 09:08:07 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:02.794 09:08:11 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:02.794 09:08:11 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:02.794 09:08:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:02.794 09:08:11 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:02.794 09:08:11 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:02.794 09:08:11 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 Hugepages 00:04:06.074 node hugesize free / total 
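The xtrace run above and below is setup/acl.sh walking the `setup.sh status` table one row at a time: each row is split with read, rows that do not look like a PCI address (the Hugepages header and hugepage-size rows fail the *:*:*.* match) are skipped, and only rows whose driver column is nvme are collected. Condensed into one place, the loop being traced is roughly the following reconstruction (structure and helper invocation are assumptions; only the per-row predicates are taken verbatim from the trace):

    # Reconstructed sketch of the device-collection loop traced in this log.
    devs=()
    declare -A drivers
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue    # skip header and hugepage rows
        [[ $driver == nvme ]] || continue    # keep only nvme-bound PCI functions
        devs+=("$dev")
        drivers["$dev"]=nvme
    done < <(./scripts/setup.sh status)      # the command the log invokes as "setup output status"

Each "read -r _ dev _ _ _ driver _" / "[[ ... == *:*:*.* ]]" pair in the trace below is one iteration of this loop over one status row.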
00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 00:04:06.074 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # 
continue 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:06.074 09:08:14 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 
09:08:14 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:06.075 09:08:14 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:06.075 09:08:14 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:06.075 09:08:14 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:06.075 09:08:14 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:06.075 ************************************ 00:04:06.075 START TEST denied 00:04:06.075 ************************************ 00:04:06.075 09:08:14 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:04:06.075 09:08:14 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:04:06.075 09:08:14 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:06.075 09:08:14 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:04:06.075 09:08:14 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:06.075 09:08:14 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:10.249 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:10.249 09:08:18 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:15.509 00:04:15.509 real 0m8.912s 00:04:15.509 user 0m3.018s 00:04:15.509 sys 0m5.187s 00:04:15.509 09:08:23 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:15.509 09:08:23 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:15.509 ************************************ 00:04:15.509 END TEST denied 00:04:15.509 ************************************ 00:04:15.509 09:08:23 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:15.509 09:08:23 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:15.509 09:08:23 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:15.509 09:08:23 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:15.509 09:08:23 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:15.509 ************************************ 00:04:15.509 START TEST allowed 00:04:15.509 ************************************ 00:04:15.509 09:08:23 setup.sh.acl.allowed -- 
common/autotest_common.sh@1123 -- # allowed 00:04:15.509 09:08:23 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:15.509 09:08:23 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:15.509 09:08:23 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:15.509 09:08:23 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:15.509 09:08:23 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:22.060 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:22.060 09:08:30 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:22.060 09:08:30 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:22.060 09:08:30 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:22.060 09:08:30 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:22.060 09:08:30 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:25.375 00:04:25.375 real 0m10.406s 00:04:25.375 user 0m2.673s 00:04:25.375 sys 0m5.112s 00:04:25.375 09:08:34 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:25.375 09:08:34 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:25.375 ************************************ 00:04:25.375 END TEST allowed 00:04:25.375 ************************************ 00:04:25.375 09:08:34 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:25.375 00:04:25.375 real 0m27.118s 00:04:25.375 user 0m8.512s 00:04:25.375 sys 0m15.523s 00:04:25.375 09:08:34 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:25.375 09:08:34 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:25.375 ************************************ 00:04:25.375 END TEST acl 00:04:25.375 ************************************ 00:04:25.375 09:08:34 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:25.376 09:08:34 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:25.376 09:08:34 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:25.376 09:08:34 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:25.376 09:08:34 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:25.376 ************************************ 00:04:25.376 START TEST hugepages 00:04:25.376 ************************************ 00:04:25.376 09:08:34 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:25.376 * Looking for test storage... 
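The acl trace above drives spdk/scripts/setup.sh through its two filtering modes: the denied test exports PCI_BLOCKED=' 0000:5e:00.0' and greps the config output for 'Skipping denied controller at 0000:5e:00.0', while the allowed test exports PCI_ALLOWED=0000:5e:00.0 and greps for the 'nvme -> vfio-pci' rebind line. The device scan that precedes both tests is the loop traced at setup/acl.sh@18-22; a minimal sketch of that loop is given here, reconstructed from the xtrace output (the status-table source and variable names are assumptions, not the verbatim acl.sh source):

# Read the device table (Type BDF Vendor Device NUMA Driver Device Block devices),
# keep only rows whose second column is a PCI BDF, skip non-NVMe drivers such as
# ioatdma and vfio-pci, honour PCI_BLOCKED, and record the surviving controllers.
declare -a devs
declare -A drivers
while read -r _ dev _ _ _ driver _; do
    [[ $dev == *:*:*.* ]] || continue            # hugepage-size rows etc. are not BDFs
    [[ $driver == nvme ]] || continue            # only NVMe controllers are verified here
    [[ $PCI_BLOCKED == *"$dev"* ]] && continue   # blocked devices never reach the list
    devs+=("$dev")
    drivers["$dev"]=nvme
done < <("$rootdir/scripts/setup.sh" status)     # $rootdir assumed to point at the spdk tree

In this run the scan leaves exactly one controller (0000:5e:00.0), which is why the trace reaches (( 1 > 0 )) before run_test denied starts.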
00:04:25.376 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 76805588 kB' 'MemAvailable: 80104380 kB' 'Buffers: 12176 kB' 'Cached: 9429396 kB' 'SwapCached: 0 kB' 'Active: 6484188 kB' 'Inactive: 3456260 kB' 'Active(anon): 6090604 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 502688 kB' 'Mapped: 202036 kB' 'Shmem: 5591728 kB' 'KReclaimable: 205184 kB' 'Slab: 526080 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320896 kB' 'KernelStack: 16144 kB' 'PageTables: 8812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438188 kB' 'Committed_AS: 7517480 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages 
-- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 
-- # IFS=': ' 00:04:25.376 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.633 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:25.634 09:08:34 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:25.634 09:08:34 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:25.634 09:08:34 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:25.634 09:08:34 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:25.634 09:08:34 
setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:25.634 ************************************ 00:04:25.634 START TEST default_setup 00:04:25.634 ************************************ 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:25.634 09:08:34 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:28.917 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:28.917 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:28.917 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:28.917 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:28.917 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:28.917 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:28.917 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 
0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:29.174 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:31.701 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.701 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78930028 kB' 'MemAvailable: 82228820 kB' 'Buffers: 12176 kB' 'Cached: 9429508 kB' 'SwapCached: 0 kB' 'Active: 6501752 kB' 'Inactive: 3456260 kB' 'Active(anon): 6108168 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519228 kB' 'Mapped: 202208 kB' 'Shmem: 5591840 kB' 'KReclaimable: 205184 kB' 'Slab: 524788 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319604 kB' 'KernelStack: 16368 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7531916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201160 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 
'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.702 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.703 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.704 09:08:40 
setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78927792 kB' 'MemAvailable: 82226584 kB' 'Buffers: 12176 kB' 'Cached: 9429508 kB' 'SwapCached: 0 kB' 'Active: 6503556 kB' 'Inactive: 3456260 kB' 'Active(anon): 6109972 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521060 kB' 'Mapped: 202216 kB' 'Shmem: 5591840 kB' 'KReclaimable: 205184 kB' 'Slab: 524780 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319596 kB' 'KernelStack: 16736 kB' 'PageTables: 9636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7531936 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201560 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.704 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.965 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.966 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 
09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78927064 kB' 'MemAvailable: 82225856 kB' 'Buffers: 12176 kB' 'Cached: 9429508 kB' 'SwapCached: 0 kB' 'Active: 6503660 kB' 'Inactive: 
3456260 kB' 'Active(anon): 6110076 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521172 kB' 'Mapped: 202224 kB' 'Shmem: 5591840 kB' 'KReclaimable: 205184 kB' 'Slab: 525292 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320108 kB' 'KernelStack: 16960 kB' 'PageTables: 10068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7530468 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201816 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.967 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
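(Annotation.) The long run of key comparisons above and below is the test's get_meminfo helper from setup/common.sh walking /proc/meminfo one key at a time until it reaches the key it was asked for, here HugePages_Surp and then HugePages_Rsvd. A minimal sketch of that lookup pattern follows; the function name get_meminfo_value is illustrative only, and the sketch deliberately omits the per-NUMA-node meminfo file the real helper can also consult:

    # get_meminfo_value: illustrative stand-in for the test's get_meminfo
    # helper. Reads plain /proc/meminfo only.
    get_meminfo_value() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            # Every key that is not the requested one is skipped; with
            # xtrace enabled, each skip is what produces the repeated
            # "[[ key == pattern ]]" / "continue" pairs in the log above.
            [[ $var == "$get" ]] || continue
            echo "${val:-0}"
            return 0
        done < /proc/meminfo
        # Key absent entirely: fall back to 0 (a choice made for this sketch).
        echo 0
    }

    # Example lookups mirroring the ones being traced here:
    anon=$(get_meminfo_value AnonHugePages)    # reported as 0 in this run
    surp=$(get_meminfo_value HugePages_Surp)   # reported as 0 in this run

Because the whole scan runs under set -x, every /proc/meminfo entry contributes its own trace lines, which is why the same IFS/read/compare/continue sequence repeats for each key before the requested value is finally echoed.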
00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.968 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:31.969 nr_hugepages=1024 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:31.969 resv_hugepages=0 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:31.969 surplus_hugepages=0 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:31.969 anon_hugepages=0 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78930400 kB' 'MemAvailable: 82229192 kB' 'Buffers: 12176 kB' 'Cached: 9429516 kB' 'SwapCached: 0 kB' 'Active: 6500984 kB' 'Inactive: 3456260 kB' 'Active(anon): 6107400 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518360 kB' 'Mapped: 202096 kB' 'Shmem: 5591848 kB' 'KReclaimable: 205184 kB' 'Slab: 525332 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320148 kB' 'KernelStack: 16416 kB' 'PageTables: 8976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 
7531980 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.969 09:08:40 setup.sh.hugepages.default_setup 
[trace condensed: the setup/common.sh@31/@32 read/continue iterations repeat unchanged for the meminfo fields from Zswap through AnonHugePages while get_meminfo scans for HugePages_Total; only the field name changes between iterations, so they are omitted here]
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:31.971 09:08:40 
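The loop traced above is the meminfo field scan that get_meminfo performs: read each line with IFS=': ', skip non-matching keys (the long run of 'continue' iterations), and echo the value once the requested key is reached, here HugePages_Total with the value 1024. The following is a minimal standalone sketch of that pattern; the function name meminfo_value and its simplifications are illustrative, not the exact setup/common.sh code.

meminfo_value() {
    # Minimal sketch of the get_meminfo-style scan traced above (illustrative only).
    local key=$1 file=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$key" ]]; then
            echo "$val"        # e.g. 1024 for HugePages_Total in this run
            return 0
        fi
        # Non-matching fields are simply skipped, which is what produces the
        # long run of 'continue' iterations in the xtrace output.
    done < "$file"
    return 1
}

# usage: meminfo_value HugePages_Total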
setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36560516 kB' 'MemUsed: 11556424 kB' 'SwapCached: 0 kB' 'Active: 5327904 kB' 'Inactive: 3372048 kB' 'Active(anon): 5170008 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8434704 kB' 'Mapped: 90332 kB' 'AnonPages: 268436 kB' 'Shmem: 4904760 kB' 'KernelStack: 9432 kB' 'PageTables: 4384 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126012 kB' 'Slab: 329064 kB' 'SReclaimable: 126012 kB' 'SUnreclaim: 203052 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.971 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.971 09:08:40 
[trace condensed: the same read/continue iterations repeat for the node0 fields from MemFree through HugePages_Free while get_meminfo scans /sys/devices/system/node/node0/meminfo for HugePages_Surp; omitted here]
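For the per-node check, get_nodes walks /sys/devices/system/node/node[0-9]* and get_meminfo is pointed at /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix (stripped in the trace via the mem[@]#Node +([0-9]) expansion) before the same field scan runs, here for HugePages_Surp; the match and the resulting "node0=1024 expecting 1024" check follow just below. A rough per-node sketch, with an assumed helper name node_meminfo_value:

node_meminfo_value() {
    # Rough sketch of the per-node lookup traced above (assumed helper name).
    # Per-node meminfo lines look like "Node 0 HugePages_Surp:  0", so the
    # leading "Node <id>" tokens are consumed before matching the key.
    local node=$1 key=$2
    local _n _id var val _
    while IFS=': ' read -r _n _id var val _; do
        if [[ $var == "$key" ]]; then
            echo "$val"
            return 0
        fi
    done < "/sys/devices/system/node/node${node}/meminfo"
    return 1
}

# usage: node_meminfo_value 0 HugePages_Total   -> 1024 in this run
#        node_meminfo_value 0 HugePages_Surp    -> 0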
09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:31.972 node0=1024 expecting 1024 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:31.972 00:04:31.972 real 0m6.379s 00:04:31.972 user 0m1.491s 00:04:31.972 sys 0m2.560s 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:31.972 09:08:40 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:31.972 ************************************ 00:04:31.972 END TEST default_setup 00:04:31.972 ************************************ 00:04:31.972 09:08:40 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:31.972 09:08:40 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:31.972 09:08:40 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:31.972 09:08:40 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:31.972 09:08:40 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:31.972 ************************************ 00:04:31.972 START TEST per_node_1G_alloc 00:04:31.972 ************************************ 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # 
get_test_nr_hugepages_per_node 0 1 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.972 09:08:40 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:36.156 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:36.156 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:36.156 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:36.156 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:36.156 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # 
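The per_node_1G_alloc test above turns the 1048576 kB (1 GiB) request into 512 default-size (2048 kB) hugepages for each of the requested nodes 0 and 1, then re-runs scripts/setup.sh with NRHUGE=512 and HUGENODE=0,1, after which the pool totals 1024 pages. A rough sketch of that size-to-per-node split; split_hugepages and its output format are illustrative, not the setup/hugepages.sh interface.

split_hugepages() {
    # Illustrative sketch of the size -> per-node page-count split traced above.
    local size_kb=$1; shift               # e.g. 1048576 kB == 1 GiB
    local hugepagesize_kb=2048            # default hugepage size in this run
    local per_node=$(( size_kb / hugepagesize_kb ))
    local node
    for node in "$@"; do                  # e.g. nodes 0 1
        echo "node${node}=${per_node}"
    done
    # Mirrors the environment handed to scripts/setup.sh in the trace.
    echo "NRHUGE=${per_node} HUGENODE=$(IFS=,; echo "$*")"
}

# usage: split_hugepages 1048576 0 1
#   node0=512
#   node1=512
#   NRHUGE=512 HUGENODE=0,1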
verify_nr_hugepages 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.156 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.157 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.157 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.157 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78929332 kB' 'MemAvailable: 82228124 kB' 'Buffers: 12176 kB' 'Cached: 9429652 kB' 'SwapCached: 0 kB' 'Active: 6500192 kB' 'Inactive: 3456260 kB' 'Active(anon): 6106608 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517472 kB' 'Mapped: 201012 kB' 'Shmem: 5591984 kB' 'KReclaimable: 205184 kB' 'Slab: 524680 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319496 kB' 'KernelStack: 16320 kB' 'PageTables: 8520 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7525844 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201176 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:36.157 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.157 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.157 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
[trace condensed: the read/continue iterations repeat for the /proc/meminfo fields from MemTotal through CommitLimit while get_meminfo scans for AnonHugePages; omitted here]
00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:36.158 09:08:44 
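At this point verify_nr_hugepages has read AnonHugePages (0 kB, so transparent hugepages are not inflating the counts) and is about to read HugePages_Surp from /proc/meminfo; together with the reserved count this feeds the same accounting already seen at hugepages.sh@110, (( 1024 == nr_hugepages + surp + resv )). A condensed sketch of that check, reusing the illustrative meminfo_value helper from the earlier sketch; reading resv from HugePages_Rsvd is an assumption based on the dumps above, not something shown explicitly in this excerpt.

verify_hugepage_total() {
    # Condensed, illustrative sketch of the hugepage-pool accounting.
    local requested=$1
    local surp resv total
    surp=$(meminfo_value HugePages_Surp)    # surplus pages beyond the static pool (0 here)
    resv=$(meminfo_value HugePages_Rsvd)    # reserved-but-unfaulted pages (assumed source)
    total=$(meminfo_value HugePages_Total)
    # The pool is considered correct when the kernel-reported total matches the
    # requested count plus surplus and reserved pages.
    (( total == requested + surp + resv ))
}

# usage: verify_hugepage_total 1024 && echo ok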
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78929864 kB' 'MemAvailable: 82228656 kB' 'Buffers: 12176 kB' 'Cached: 9429656 kB' 'SwapCached: 0 kB' 'Active: 6499580 kB' 'Inactive: 3456260 kB' 'Active(anon): 6105996 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516872 kB' 'Mapped: 200980 kB' 'Shmem: 5591988 kB' 'KReclaimable: 205184 kB' 'Slab: 524680 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319496 kB' 'KernelStack: 16240 kB' 'PageTables: 8168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7527352 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.158 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- 
[trace condensed: the read/continue iterations repeat for the /proc/meminfo fields from Buffers onward while get_meminfo scans for HugePages_Surp; the scan continues in the same pattern]
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.159 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78929444 kB' 'MemAvailable: 82228236 kB' 'Buffers: 12176 kB' 'Cached: 9429672 kB' 'SwapCached: 0 kB' 'Active: 6499616 kB' 'Inactive: 3456260 kB' 'Active(anon): 6106032 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517344 kB' 'Mapped: 200980 kB' 'Shmem: 5592004 kB' 'KReclaimable: 205184 kB' 'Slab: 524752 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319568 kB' 'KernelStack: 16336 kB' 'PageTables: 8256 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7527376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201064 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.160 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.161 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:36.162 nr_hugepages=1024 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:36.162 resv_hugepages=0 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:36.162 surplus_hugepages=0 00:04:36.162 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:36.162 anon_hugepages=0 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78930208 kB' 'MemAvailable: 82229000 kB' 'Buffers: 12176 kB' 'Cached: 9429672 kB' 'SwapCached: 0 kB' 'Active: 6499396 kB' 'Inactive: 3456260 kB' 'Active(anon): 6105812 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517124 kB' 'Mapped: 200980 kB' 'Shmem: 5592004 kB' 'KReclaimable: 205184 kB' 'Slab: 524752 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319568 kB' 'KernelStack: 16160 kB' 'PageTables: 8000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7527396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.162 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.163 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37611856 kB' 'MemUsed: 10505084 kB' 'SwapCached: 0 kB' 'Active: 5328964 kB' 'Inactive: 3372048 kB' 'Active(anon): 5171068 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8434848 kB' 'Mapped: 90036 kB' 'AnonPages: 269440 kB' 'Shmem: 4904904 kB' 'KernelStack: 9416 kB' 'PageTables: 4400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126012 kB' 'Slab: 328868 kB' 'SReclaimable: 126012 kB' 'SUnreclaim: 202856 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.164 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41318764 kB' 'MemUsed: 2857768 kB' 'SwapCached: 0 kB' 'Active: 1170976 kB' 'Inactive: 84212 kB' 'Active(anon): 935288 kB' 'Inactive(anon): 0 kB' 'Active(file): 235688 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1007048 kB' 'Mapped: 110944 kB' 'AnonPages: 248168 kB' 'Shmem: 687148 kB' 'KernelStack: 6840 kB' 'PageTables: 4072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79172 kB' 'Slab: 195884 kB' 'SReclaimable: 79172 kB' 'SUnreclaim: 116712 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.165 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 
09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:36.166 node0=512 expecting 512 00:04:36.166 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:36.167 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:36.167 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:36.167 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:36.167 node1=512 expecting 512 00:04:36.167 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:36.167 00:04:36.167 real 0m3.940s 00:04:36.167 user 0m1.521s 00:04:36.167 sys 0m2.526s 00:04:36.167 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:36.167 09:08:44 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:36.167 ************************************ 00:04:36.167 END TEST per_node_1G_alloc 00:04:36.167 ************************************ 00:04:36.167 09:08:44 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:36.167 09:08:44 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:36.167 09:08:44 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:36.167 09:08:44 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:36.167 09:08:44 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set 
+x 00:04:36.167 ************************************ 00:04:36.167 START TEST even_2G_alloc 00:04:36.167 ************************************ 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:36.167 09:08:44 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:39.454 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:39.454 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:39.454 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:39.454 0000:00:04.6 (8086 2021): Already using 
the vfio-pci driver 00:04:39.454 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:39.454 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78929116 kB' 'MemAvailable: 82227908 kB' 'Buffers: 12176 kB' 'Cached: 9429804 kB' 'SwapCached: 0 kB' 'Active: 6500640 kB' 'Inactive: 3456260 kB' 'Active(anon): 6107056 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 
8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517748 kB' 'Mapped: 201168 kB' 'Shmem: 5592136 kB' 'KReclaimable: 205184 kB' 'Slab: 524400 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319216 kB' 'KernelStack: 16272 kB' 'PageTables: 8416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7525296 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.454 09:08:48 setup.sh.hugepages.even_2G_alloc -- 
[... repetitive xtrace elided: the same continue / IFS=': ' / read -r var val _ trio repeats for each /proc/meminfo field from Inactive through WritebackTmp, none of which matches AnonHugePages; the trace resumes mid-entry below ...]
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 
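The wrapped xtrace above, and the near-identical scans that follow for HugePages_Surp, HugePages_Rsvd and HugePages_Total, all come from the get_meminfo helper in the test suite's setup/common.sh: it snapshots /proc/meminfo (or a NUMA node's own meminfo file), prints the fields one per line, and reads them back until the requested key matches. A minimal sketch of that lookup, reconstructed from the traced commands; it is not the verbatim SPDK source, and the per-node handling is an assumption based on the '[[ -e /sys/devices/system/node/node/meminfo ]]' frames:

    shopt -s extglob    # needed for the +([0-9]) pattern that strips "Node N " prefixes

    get_meminfo() {
        local get=$1 node=${2:-}    # field name, optional NUMA node
        local var val _
        local mem_f=/proc/meminfo mem
        # per-node lookups read that node's meminfo instead of the global file
        [[ -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")    # per-node files prefix each line with "Node N "
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue    # the repeated 'continue' entries in the trace
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    get_meminfo AnonHugePages    # prints 0 on this node, matching the anon=0 frame above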
00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.455 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.456 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78929644 kB' 'MemAvailable: 82228436 kB' 'Buffers: 12176 kB' 'Cached: 9429808 kB' 'SwapCached: 0 kB' 'Active: 6499868 kB' 'Inactive: 3456260 kB' 'Active(anon): 6106284 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517468 kB' 'Mapped: 201060 kB' 'Shmem: 5592140 kB' 'KReclaimable: 205184 kB' 'Slab: 524368 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319184 kB' 'KernelStack: 16272 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7525312 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:39.456 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.456 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.456 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.456 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.456 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.456 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.456 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.715 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.715 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
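The hugepages.sh frames interleaved with these scans collect the per-type counters: anon (AnonHugePages) was derived above, this call derives surp (HugePages_Surp), and the entries further below derive resv (HugePages_Rsvd) and then assert that the 1024-page pool in play here is fully covered by the static allocation. A sketch of that bookkeeping, assuming a get_meminfo like the one sketched earlier; verify_hugepage_accounting and expected are invented names, not the script's own:

    # Mirrors the hugepages.sh bookkeeping visible in this trace: look up the
    # anon/surp/resv counters, echo them, then check the totals the same way the
    # '(( 1024 == nr_hugepages + surp + resv ))' and '(( 1024 == nr_hugepages ))'
    # frames further below do. Names are illustrative only.
    verify_hugepage_accounting() {
        local expected=$1                      # 1024 pages for this run
        local anon surp resv nr_hugepages
        anon=$(get_meminfo AnonHugePages)      # transparent hugepages in use (kB)
        surp=$(get_meminfo HugePages_Surp)     # surplus pages beyond the static pool
        resv=$(get_meminfo HugePages_Rsvd)     # reserved but not yet faulted-in pages
        nr_hugepages=$(get_meminfo HugePages_Total)
        echo "nr_hugepages=$nr_hugepages"
        echo "resv_hugepages=$resv"
        echo "surplus_hugepages=$surp"
        echo "anon_hugepages=$anon"
        # the pool must consist of exactly the statically allocated pages, with
        # no surplus pages and no outstanding reservations
        (( expected == nr_hugepages + surp + resv )) && (( expected == nr_hugepages ))
    }

    verify_hugepage_accounting 1024    # succeeds here: 1024 == 1024 + 0 + 0

For orientation, the snapshot just above is internally consistent with these figures: HugePages_Total 1024 pages at a Hugepagesize of 2048 kB gives 2097152 kB, exactly the Hugetlb value reported.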
[... repetitive xtrace elided: the continue / IFS=': ' / read -r var val _ trio repeats for each field from Buffers through CmaTotal while the scan looks for HugePages_Surp; the trace resumes mid-entry below ...]
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.716 
09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78929600 kB' 'MemAvailable: 82228392 kB' 'Buffers: 12176 kB' 'Cached: 9429840 kB' 'SwapCached: 0 kB' 'Active: 6499900 kB' 'Inactive: 3456260 kB' 'Active(anon): 6106316 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517472 kB' 'Mapped: 201060 kB' 'Shmem: 5592172 kB' 'KReclaimable: 205184 kB' 'Slab: 524368 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319184 kB' 'KernelStack: 16272 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7525332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.716 09:08:48 setup.sh.hugepages.even_2G_alloc -- 
[... repetitive xtrace elided: the continue / IFS=': ' / read -r var val _ trio repeats for each field from Cached through CmaFree while the scan looks for HugePages_Rsvd; the trace resumes mid-entry below ...]
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:39.717 nr_hugepages=1024 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:39.717 resv_hugepages=0 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:39.717 surplus_hugepages=0 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:39.717 anon_hugepages=0 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78928960 kB' 'MemAvailable: 82227752 kB' 'Buffers: 12176 kB' 'Cached: 9429844 kB' 'SwapCached: 0 kB' 'Active: 6499600 kB' 'Inactive: 3456260 kB' 'Active(anon): 6106016 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517168 kB' 'Mapped: 201060 kB' 'Shmem: 5592176 kB' 'KReclaimable: 205184 kB' 'Slab: 524368 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319184 kB' 'KernelStack: 16272 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7525356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
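The long runs of '[[ <key> == \H\u\g\e\P\a\g\e\s\_... ]]' followed by 'continue' above and below are setup/common.sh's get_meminfo stepping through /proc/meminfo one key at a time until the requested key matches, then echoing its value (0 for HugePages_Rsvd just above, 1024 for HugePages_Total in the pass that continues below). A minimal sketch of that lookup pattern, assuming plain bash and using meminfo_value purely as an illustrative name rather than the SPDK helper:

#!/usr/bin/env bash
# Illustrative only; in this log the lookup is done by setup/common.sh's get_meminfo.
meminfo_value() {
    local key=$1 var val _
    while IFS=': ' read -r var val _; do
        # Same idea as the traced loop: skip every key until the requested one matches.
        [[ $var == "$key" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1
}

meminfo_value HugePages_Total   # prints 1024 on the test system above
meminfo_value HugePages_Rsvd    # prints 0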
00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.717 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37603360 kB' 'MemUsed: 10513580 kB' 'SwapCached: 0 kB' 'Active: 5328768 kB' 'Inactive: 3372048 kB' 'Active(anon): 5170872 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8434984 kB' 'Mapped: 90104 kB' 'AnonPages: 268992 kB' 'Shmem: 4905040 kB' 'KernelStack: 9384 kB' 'PageTables: 4224 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126012 kB' 'Slab: 328652 kB' 'SReclaimable: 126012 kB' 
'SUnreclaim: 202640 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
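The pass running through this stretch is node-scoped: with node=0 the script swaps /proc/meminfo for /sys/devices/system/node/node0/meminfo (node1 follows further down), and the mem=("${mem[@]#Node +([0-9]) }") entries strip the 'Node 0 ' prefix those sysfs lines carry before the same key matching runs. A standalone sketch of that per-node lookup, assuming bash with extglob and using node_meminfo_value purely as an illustrative name, not an SPDK function:

#!/usr/bin/env bash
shopt -s extglob   # needed for the "Node +([0-9]) " prefix pattern below

# Illustrative per-node variant of the same lookup.
node_meminfo_value() {
    local key=$1 node=${2:-} file=/proc/meminfo line var val _
    # With a node number, read the per-node copy under sysfs instead of /proc/meminfo.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        file=/sys/devices/system/node/node$node/meminfo
    fi
    while read -r line; do
        line=${line#Node +([0-9]) }   # sysfs lines are prefixed "Node 0 "; /proc lines are not
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$key" ]] && { echo "$val"; return 0; }
    done < "$file"
    return 1
}

node_meminfo_value HugePages_Surp 0   # surplus huge pages on NUMA node 0 (0 in the pass above)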
00:04:39.718 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41325768 kB' 'MemUsed: 2850764 kB' 'SwapCached: 0 kB' 'Active: 1170924 kB' 'Inactive: 84212 kB' 'Active(anon): 935236 kB' 'Inactive(anon): 0 kB' 'Active(file): 235688 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1007060 kB' 'Mapped: 110956 kB' 'AnonPages: 248212 kB' 'Shmem: 687160 kB' 'KernelStack: 6776 kB' 'PageTables: 3872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'KReclaimable: 79172 kB' 'Slab: 195716 kB' 'SReclaimable: 79172 kB' 'SUnreclaim: 116544 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:39.719 node0=512 expecting 512 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:39.719 node1=512 expecting 512 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:39.719 00:04:39.719 real 0m3.654s 00:04:39.719 user 0m1.441s 00:04:39.719 sys 0m2.298s 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:39.719 09:08:48 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:39.719 ************************************ 00:04:39.719 END TEST even_2G_alloc 00:04:39.719 ************************************ 00:04:39.719 09:08:48 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:39.719 09:08:48 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:39.719 09:08:48 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:39.719 09:08:48 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:39.719 09:08:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:39.719 ************************************ 00:04:39.719 START TEST odd_alloc 
00:04:39.719 ************************************ 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:39.719 09:08:48 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:43.063 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:43.063 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:43.063 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:43.063 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:00:04.3 (8086 2021): Already 
using the vfio-pci driver 00:04:43.063 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:43.063 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78932548 kB' 'MemAvailable: 82231340 kB' 'Buffers: 12176 kB' 'Cached: 9429952 kB' 'SwapCached: 0 kB' 'Active: 6501232 kB' 'Inactive: 3456260 kB' 'Active(anon): 6107648 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518588 kB' 'Mapped: 201124 kB' 'Shmem: 5592284 kB' 'KReclaimable: 205184 kB' 'Slab: 523740 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 318556 kB' 'KernelStack: 16160 kB' 'PageTables: 8076 kB' 'SecPageTables: 0 
kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7525828 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.337 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.338 09:08:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.338 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78933304 kB' 'MemAvailable: 82232096 kB' 'Buffers: 12176 kB' 'Cached: 9429956 kB' 'SwapCached: 0 kB' 'Active: 6500928 kB' 'Inactive: 3456260 kB' 'Active(anon): 6107344 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518300 kB' 'Mapped: 201072 kB' 'Shmem: 5592288 kB' 'KReclaimable: 205184 kB' 'Slab: 523784 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 318600 kB' 'KernelStack: 16160 kB' 'PageTables: 8084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7525844 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
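The odd_alloc test that begins above requests 1025 hugepages, an intentionally odd count, and the get_test_nr_hugepages_per_node portion of the trace shows them being spread over the two NUMA nodes as 512 and 513. The standalone snippet below reproduces that split under the assumption that the traced ": 513" / ": 1" lines are the remaining-pages and remaining-nodes bookkeeping; variable names follow the trace, but the snippet is an illustration rather than a copy of setup/hugepages.sh.

  _nr_hugepages=1025   # requested page count (odd on purpose)
  _no_nodes=2          # NUMA nodes on this system, per the trace
  nodes_test=()
  while (( _no_nodes > 0 )); do
      # Give the last remaining node its even share, then let the first node pick up the remainder:
      nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))   # 1025/2 -> 512, then 513/1 -> 513
      : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))          # 513 left, then 0
      : $(( --_no_nodes ))                                         # 1 node left, then 0
  done
  echo "${nodes_test[@]}"   # "513 512": one node carries the extra page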
00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
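Each get_meminfo call in this trace starts with an empty "local node=", so the "[[ -e /sys/devices/system/node/node/meminfo ]]" test is probing a deliberately incomplete path, fails, and the helper falls back to the system-wide /proc/meminfo. When a node index is supplied, the same check would presumably select the per-node file instead, whose lines carry a "Node N " prefix that the traced mem=("${mem[@]#Node +([0-9]) }") step strips. A small sketch of that selection, with an illustrative function name, follows.

  meminfo_source() {
      # Pick the per-node meminfo file when a node index is given and the file exists,
      # otherwise fall back to /proc/meminfo (the case traced above, where the empty
      # $node turns the path into ".../node/node/meminfo").
      local node=${1:-}
      local f=/proc/meminfo
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          f=/sys/devices/system/node/node$node/meminfo   # lines here carry a "Node $node " prefix
      fi
      echo "$f"
  }

  meminfo_source      # -> /proc/meminfo, as in the odd_alloc checks above
  meminfo_source 0    # -> /sys/devices/system/node/node0/meminfo on a NUMA machine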
00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.339 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 
09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78934124 kB' 'MemAvailable: 82232916 kB' 'Buffers: 12176 kB' 'Cached: 9429972 kB' 'SwapCached: 0 kB' 'Active: 6500956 kB' 'Inactive: 3456260 kB' 'Active(anon): 6107372 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518308 kB' 'Mapped: 201072 kB' 'Shmem: 5592304 kB' 'KReclaimable: 205184 kB' 'Slab: 523784 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 318600 kB' 'KernelStack: 16160 kB' 'PageTables: 8084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7525864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 
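The snapshot just printed is consistent with the parameters the odd_alloc test set up earlier: HUGEMEM=2049 (MB) corresponds to the 2098176 kB passed to get_test_nr_hugepages, which at the reported Hugepagesize of 2048 kB comes to 1024.5 and is rounded to the odd count nr_hugepages=1025 seen in HugePages_Total and HugePages_Free, while Hugetlb is exactly that many pages. A quick cross-check using only numbers present in the log:

  hugemem_mb=2049   # HUGEMEM exported by the odd_alloc test
  page_kb=2048      # Hugepagesize: 2048 kB
  pages=1025        # HugePages_Total / HugePages_Free in the dump above

  echo $(( hugemem_mb * 1024 ))   # 2098176 kB, the size handed to get_test_nr_hugepages
  echo $(( pages * page_kb ))     # 2099200 kB, matching the "Hugetlb: 2099200 kB" line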
00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.340 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.341 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _
00:04:43.341 [xtrace condensed: setup/common.sh@32 compared the remaining /proc/meminfo keys, Dirty through HugePages_Free, against HugePages_Rsvd; none matched and the loop continued]
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:43.342 nr_hugepages=1025
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:43.342 resv_hugepages=0
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:43.342 surplus_hugepages=0
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:43.342 anon_hugepages=0
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
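The xtrace above is the get_meminfo helper from setup/common.sh walking /proc/meminfo one key at a time until it reaches HugePages_Rsvd and echoes the value. A minimal sketch of that lookup, reconstructed from the traced statements rather than copied from the SPDK script (the per-line loop form and the handling of the kB column are assumptions), could look like:

  #!/usr/bin/env bash
  # Hedged reconstruction of the traced lookup: read /proc/meminfo, or a NUMA
  # node's meminfo when a node index is given, and print one key's value.
  shopt -s extglob
  get_meminfo() {
      local get=$1 node=$2
      local line var val
      local mem_f=/proc/meminfo
      [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
          mem_f=/sys/devices/system/node/node$node/meminfo
      while IFS= read -r line; do
          line=${line#Node +([0-9]) }   # per-node files prefix each key with "Node <n> "
          IFS=': ' read -r var val _ <<< "$line"
          if [[ $var == "$get" ]]; then
              echo "$val"               # numeric column only, e.g. 1025
              return 0
          fi
      done < "$mem_f"
      return 1
  }

  get_meminfo HugePages_Total     # system-wide: 1025 in the run above
  get_meminfo HugePages_Surp 0    # node 0 surplus pages: 0 in the run above

The traced helper slurps the file with mapfile -t and strips the "Node <n> " prefix with one array expansion before the IFS=': ' read loop; the per-line loop above is just a more compact rendering of the same lookup.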
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:43.342 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78934160 kB' 'MemAvailable: 82232952 kB' 'Buffers: 12176 kB' 'Cached: 9429972 kB' 'SwapCached: 0 kB' 'Active: 6500964 kB' 'Inactive: 3456260 kB' 'Active(anon): 6107380 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518272 kB' 'Mapped: 201072 kB' 'Shmem: 5592304 kB' 'KReclaimable: 205184 kB' 'Slab: 523784 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 318600 kB' 'KernelStack: 16192 kB' 'PageTables: 8160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7525884 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB'
00:04:43.343 [xtrace condensed: setup/common.sh@32 compared keys MemTotal through Unaccepted against HugePages_Total; none matched and the loop continued]
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
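get_nodes above ends up with nodes_sys[0]=512 and nodes_sys[1]=513 for nr_hugepages=1025 over no_nodes=2, i.e. an odd split with one extra page on one node. A hedged sketch of such a split is below; only the resulting 512/513 assignments are visible in the trace, so the remainder-to-last-node policy is an assumption, not the actual setup/hugepages.sh logic:

  #!/usr/bin/env bash
  # Enumerate NUMA nodes and split an odd hugepage count so that one node gets
  # the extra page (1025 over 2 nodes -> 512 and 513, matching the trace above).
  shopt -s extglob nullglob
  nr_hugepages=1025
  nodes_sys=()
  nodes=(/sys/devices/system/node/node+([0-9]))
  no_nodes=${#nodes[@]}
  (( no_nodes > 0 )) || exit 1
  for node in "${nodes[@]}"; do
      idx=${node##*node}                      # ".../node1" -> "1"
      nodes_sys[idx]=$(( nr_hugepages / no_nodes ))
  done
  # Assumption: the remainder lands on the last node, which reproduces 512/513.
  (( nodes_sys[idx] += nr_hugepages % no_nodes ))
  declare -p nodes_sys no_nodes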
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:43.344 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37605216 kB' 'MemUsed: 10511724 kB' 'SwapCached: 0 kB' 'Active: 5327956 kB' 'Inactive: 3372048 kB' 'Active(anon): 5170060 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8435120 kB' 'Mapped: 90104 kB' 'AnonPages: 268008 kB' 'Shmem: 4905176 kB' 'KernelStack: 9352 kB' 'PageTables: 4128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126012 kB' 'Slab: 328376 kB' 'SReclaimable: 126012 kB' 'SUnreclaim: 202364 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:43.344 [xtrace condensed: setup/common.sh@32 compared node0 keys MemTotal through HugePages_Free against HugePages_Surp; none matched and the loop continued]
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:43.345 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41326476 kB' 'MemUsed: 2850056 kB' 'SwapCached: 0 kB' 'Active: 1173296 kB' 'Inactive: 84212 kB' 'Active(anon): 937608 kB' 'Inactive(anon): 0 kB' 'Active(file): 235688 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1007088 kB' 'Mapped: 110968 kB' 'AnonPages: 250492 kB' 'Shmem: 687188 kB' 'KernelStack: 6792 kB' 'PageTables: 3928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79172 kB' 'Slab: 195408 kB' 'SReclaimable: 79172 kB' 'SUnreclaim: 116236 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
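The two per-node snapshots above report HugePages_Total of 512 on node0 and 513 on node1 against the system-wide 1025, which is exactly the odd split being verified. A small cross-check sketch is below; it reuses the get_meminfo sketch from the earlier note, and the sourced file name is hypothetical:

  #!/usr/bin/env bash
  # Verify that the per-node totals add up to the system-wide count
  # (512 + 513 == 1025 for the snapshots logged above).
  shopt -s extglob nullglob
  . ./get_meminfo.sh   # hypothetical path holding the get_meminfo sketch above
  total=$(get_meminfo HugePages_Total)
  node_sum=0
  for node_dir in /sys/devices/system/node/node+([0-9]); do
      idx=${node_dir##*node}
      (( node_sum += $(get_meminfo HugePages_Total "$idx") ))
  done
  if (( node_sum == total )); then
      echo "odd_alloc check OK: per-node sum $node_sum == total $total"
  else
      echo "odd_alloc check FAILED: per-node sum $node_sum != total $total" >&2
      exit 1
  fi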
setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.346 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:43.347 09:08:52 
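The scan traced above is the generic /proc/meminfo walk in setup/common.sh: each line is split on IFS=': ', read into var/val, and skipped with continue until the requested key (here HugePages_Surp) matches, at which point its value is echoed and the helper returns 0. A minimal sketch of that lookup, simplified and not the verbatim SPDK helper (the real code first snapshots the file with mapfile and strips a leading "Node N " prefix for per-node files; this sketch reads the file directly):

    # Sketch only: standalone approximation of the meminfo lookup seen in the trace.
    get_meminfo_sketch() {
        local get=$1 node=${2:-}          # key to look up, optional NUMA node
        local mem_f=/proc/meminfo
        # Per-node lookups read the node-local meminfo when it exists
        # (assumed from the node= branch visible in the trace).
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done <"$mem_f"
        return 1                          # requested key not present
    }

    get_meminfo_sketch HugePages_Surp     # prints 0 here, as echoed in the trace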
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:43.347 node0=512 expecting 513 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:43.347 node1=513 expecting 512 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:43.347 00:04:43.347 real 0m3.613s 00:04:43.347 user 0m1.392s 00:04:43.347 sys 0m2.281s 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:43.347 09:08:52 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:43.347 ************************************ 00:04:43.347 END TEST odd_alloc 00:04:43.347 ************************************ 00:04:43.347 09:08:52 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:43.347 09:08:52 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:43.347 09:08:52 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:43.347 09:08:52 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:43.347 09:08:52 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:43.605 ************************************ 00:04:43.605 START TEST custom_alloc 00:04:43.605 ************************************ 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # 
user_nodes=() 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@78 -- # return 0 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:43.605 09:08:52 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:46.905 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:46.905 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:46.905 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:46.905 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:00:04.1 (8086 
2021): Already using the vfio-pci driver 00:04:46.905 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:46.905 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.905 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77869212 kB' 'MemAvailable: 81168004 kB' 'Buffers: 12176 kB' 'Cached: 9430104 kB' 'SwapCached: 0 kB' 'Active: 6502232 kB' 'Inactive: 3456260 kB' 'Active(anon): 6108648 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518836 kB' 'Mapped: 201180 kB' 'Shmem: 5592436 kB' 'KReclaimable: 205184 kB' 'Slab: 525228 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320044 kB' 'KernelStack: 16336 kB' 
'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7525996 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201064 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 
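Before these verification scans, the custom_alloc preparation earlier in the trace derived its targets with plain integer arithmetic: 1048576 kB over the default 2048 kB hugepage size gives 512 pages (split 256+256 across the two nodes), a second pass with 2097152 kB gives 1024 pages, and the resulting nodes_hp[0]=512 / nodes_hp[1]=1024 are joined into the HUGENODE string echoed at hugepages.sh@187. A hedged sketch of that arithmetic and of the HUGENODE assembly (names mirror the trace, but this is not the exact hugepages.sh code):

    # Sketch, assuming the 2048 kB default hugepage size reported in the snapshots.
    default_hugepages=2048                      # kB
    nodes_hp=()
    size_to_pages() { echo $(( $1 / default_hugepages )); }

    nodes_hp[0]=$(size_to_pages 1048576)        # 1 GiB -> 512 pages
    nodes_hp[1]=$(size_to_pages 2097152)        # 2 GiB -> 1024 pages

    # HUGENODE is assembled as comma-separated nodes_hp[i]=N assignments,
    # matching the string echoed in the trace.
    HUGENODE=()
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    done
    ( IFS=,; echo "HUGENODE=${HUGENODE[*]}" )   # HUGENODE=nodes_hp[0]=512,nodes_hp[1]=1024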
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.906 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.907 
09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77870496 kB' 'MemAvailable: 81169288 kB' 'Buffers: 12176 kB' 'Cached: 9430116 kB' 'SwapCached: 0 kB' 'Active: 6501636 kB' 'Inactive: 3456260 kB' 'Active(anon): 6108052 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518788 kB' 'Mapped: 201080 kB' 'Shmem: 5592448 kB' 'KReclaimable: 205184 kB' 'Slab: 525132 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319948 kB' 'KernelStack: 16176 kB' 'PageTables: 8152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7526516 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 
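The snapshots above are internally consistent with the 512+1024 request: HugePages_Total and HugePages_Free are both 1536, and Hugetlb equals the page count times the 2048 kB page size. Illustrative arithmetic only:

    echo $(( 512 + 1024 ))      # 1536       -> HugePages_Total / nr_hugepages
    echo $(( 1536 * 2048 ))     # 3145728 kB -> Hugetlb (3 GiB), none in use yet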
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.907 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 
09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.908 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77869424 kB' 'MemAvailable: 81168216 kB' 'Buffers: 12176 kB' 'Cached: 9430132 kB' 'SwapCached: 0 kB' 'Active: 6501656 kB' 'Inactive: 3456260 kB' 'Active(anon): 6108072 kB' 'Inactive(anon): 0 kB' 
'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518856 kB' 'Mapped: 201080 kB' 'Shmem: 5592464 kB' 'KReclaimable: 205184 kB' 'Slab: 525132 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319948 kB' 'KernelStack: 16192 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7526536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.909 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 
09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.910 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.910 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
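The trace above is setup/common.sh's get_meminfo helper at work: it loads /proc/meminfo (or a per-node meminfo file when a node id is passed) into an array, strips the "Node N " prefix, then walks every key with IFS=': ' and read -r var val _, skipping with continue until the key equals the requested field (here HugePages_Rsvd) and echoing its value. A minimal standalone sketch of that pattern, reconstructed from the trace rather than copied from the script, with hypothetical call sites:

    #!/usr/bin/env bash
    # Sketch of the get_meminfo pattern seen in the trace (setup/common.sh);
    # reconstructed, so details may differ from the real helper.
    shopt -s extglob

    get_meminfo() {
        local get=$1 node=$2 var val _
        local mem_f=/proc/meminfo mem
        # Per-node counters live under /sys/devices/system/node/nodeN/meminfo.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every key with "Node N "; drop that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"    # e.g. 0 for HugePages_Rsvd, 1536 for HugePages_Total
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    get_meminfo HugePages_Rsvd      # system-wide value
    get_meminfo HugePages_Surp 0    # value for NUMA node 0

The long [[ key == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] lines in the log are simply this comparison under xtrace, with the quoted right-hand side printed character by character.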
00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:46.911 nr_hugepages=1536 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:46.911 resv_hugepages=0 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:46.911 surplus_hugepages=0 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:46.911 anon_hugepages=0 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:46.911 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:47.170 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:47.170 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:47.170 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:47.170 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:47.170 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:47.170 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.170 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:47.170 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77872208 kB' 'MemAvailable: 81171000 kB' 'Buffers: 12176 kB' 'Cached: 9430156 kB' 'SwapCached: 0 kB' 'Active: 6501692 kB' 'Inactive: 3456260 kB' 'Active(anon): 6108108 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518840 kB' 'Mapped: 201080 kB' 'Shmem: 5592488 kB' 'KReclaimable: 205184 kB' 'Slab: 525132 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319948 kB' 'KernelStack: 16176 kB' 
'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7526560 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
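The scan in progress here is get_meminfo looking up HugePages_Total; it returns 1536, and a little further down hugepages.sh's get_nodes loop enumerates /sys/devices/system/node/node+([0-9]) and records the per-node split for the custom allocation: 512 pages on node 0 and 1024 on node 1, 1536 in total. A rough sketch of per-node bookkeeping along these lines, reusing the get_meminfo sketch above (reconstructed, not the verbatim hugepages.sh):

    shopt -s extglob nullglob
    declare -A nodes_sys

    # Record the hugepage count of every NUMA node the kernel exposes.
    for node in /sys/devices/system/node/node+([0-9]); do
        id=${node##*node}                                   # "node0" -> 0
        nodes_sys[$id]=$(get_meminfo HugePages_Total "$id")  # 512 / 1024 here
    done
    no_nodes=${#nodes_sys[@]}                               # 2 in this run

    # Sanity check in the spirit of the custom_alloc test: the per-node
    # counts should add up to the requested total
    # (nr_hugepages + surplus + reserved = 1536 + 0 + 0).
    total=0
    for id in "${!nodes_sys[@]}"; do
        (( total += nodes_sys[id] ))
    done
    (( total == 1536 )) && echo "per-node split OK: ${nodes_sys[*]}"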
00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.171 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:47.172 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37594940 kB' 'MemUsed: 10522000 kB' 'SwapCached: 0 kB' 'Active: 5328232 kB' 'Inactive: 3372048 kB' 'Active(anon): 5170336 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8435240 kB' 'Mapped: 90104 kB' 'AnonPages: 268204 kB' 'Shmem: 4905296 kB' 'KernelStack: 9416 kB' 'PageTables: 4260 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126012 kB' 'Slab: 329388 kB' 'SReclaimable: 126012 kB' 'SUnreclaim: 203376 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 
09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.172 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40277268 kB' 'MemUsed: 3899264 kB' 'SwapCached: 0 kB' 'Active: 1173116 kB' 'Inactive: 84212 kB' 'Active(anon): 937428 kB' 'Inactive(anon): 0 kB' 'Active(file): 235688 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1007128 kB' 'Mapped: 110976 kB' 'AnonPages: 250252 kB' 'Shmem: 687228 kB' 'KernelStack: 6760 kB' 'PageTables: 3864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79172 kB' 'Slab: 195744 kB' 'SReclaimable: 79172 kB' 'SUnreclaim: 116572 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.173 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:47.174 09:08:55 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:47.174 node0=512 expecting 512 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:47.174 node1=1024 expecting 1024 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:47.174 00:04:47.174 real 0m3.640s 00:04:47.174 user 0m1.390s 00:04:47.174 sys 0m2.323s 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:47.174 09:08:55 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:47.174 ************************************ 00:04:47.174 END TEST custom_alloc 00:04:47.174 ************************************ 00:04:47.174 09:08:55 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:47.174 09:08:55 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:47.174 09:08:55 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:47.174 09:08:55 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:47.174 09:08:55 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:47.174 ************************************ 00:04:47.174 START TEST no_shrink_alloc 00:04:47.174 ************************************ 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 
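The xtrace above shows setup/common.sh's get_meminfo being walked field by field: when a node number is given it switches from /proc/meminfo to /sys/devices/system/node/node1/meminfo, strips the leading "Node N " prefix from each line, splits on ': ', and echoes the value once the requested key (here HugePages_Surp) is reached, otherwise 0. A minimal standalone sketch of that lookup, using a hypothetical helper name get_meminfo_sketch (not part of the SPDK scripts, and condensed from the mapfile/read loop traced above), might look like:

get_meminfo_sketch() {
  # Sketch only: same idea as setup/common.sh get_meminfo, not the real implementation.
  local get=$1 node=$2 line var val _
  local mem_f=/proc/meminfo
  # Per-node lookups read the node's own meminfo file when it exists.
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  while IFS= read -r line; do
    # Per-node files prefix every field with "Node N "; strip it before matching.
    line=${line#"Node $node "}
    IFS=': ' read -r var val _ <<< "$line"
    if [[ $var == "$get" ]]; then
      echo "${val:-0}"
      return 0
    fi
  done < "$mem_f"
  echo 0
}

On the node1 snapshot printed in the trace above (HugePages_Surp: 0), a call such as get_meminfo_sketch HugePages_Surp 1 would print 0, matching the "echo 0" / "return 0" steps recorded before the node0=512 / node1=1024 comparison.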
00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:47.174 09:08:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:50.457 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:50.457 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:50.457 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:50.457 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:50.457 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:50.457 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78896864 kB' 'MemAvailable: 82195656 kB' 'Buffers: 12176 kB' 'Cached: 9430256 kB' 'SwapCached: 0 kB' 'Active: 6503376 kB' 'Inactive: 3456260 kB' 'Active(anon): 6109792 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520428 kB' 'Mapped: 201604 kB' 'Shmem: 5592588 kB' 'KReclaimable: 205184 kB' 'Slab: 525356 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320172 kB' 'KernelStack: 16160 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7528944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.718 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78898156 kB' 'MemAvailable: 82196948 kB' 'Buffers: 12176 kB' 'Cached: 9430260 kB' 'SwapCached: 0 kB' 'Active: 6503868 kB' 'Inactive: 3456260 kB' 'Active(anon): 6110284 kB' 
'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521052 kB' 'Mapped: 201604 kB' 'Shmem: 5592592 kB' 'KReclaimable: 205184 kB' 'Slab: 525356 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320172 kB' 'KernelStack: 16208 kB' 'PageTables: 8228 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7528596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.719 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ [... the same setup/common.sh@31-32 cycle of IFS=': ', read -r var val _, [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] and continue repeats at 09:08:59 for each remaining /proc/meminfo key, Active through Unaccepted ...]
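The run of continue lines above is setup/common.sh's get_meminfo helper walking /proc/meminfo one "Key: value" line at a time until it reaches the requested key (HugePages_Surp here). A minimal sketch of that helper, reconstructed from the xtrace rather than copied from the SPDK source, so the exact body and option handling are assumptions:

  # Sketch only; reconstructed from the xtrace, not the verbatim setup/common.sh.
  shopt -s extglob                        # for the +([0-9]) prefix strip below

  get_meminfo() {
      local get=$1 node=${2:-}            # e.g. get_meminfo HugePages_Surp
      local var val _
      local mem_f mem
      mem_f=/proc/meminfo
      # with a node argument, read the per-node meminfo exposed via sysfs instead
      if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")    # drop the "Node N " prefix of per-node lines
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] || continue    # the long run of continue seen above
          echo "$val"                         # the bare number; the "kB" unit lands in _
          return 0
      done < <(printf '%s\n' "${mem[@]}")
      return 1
  }

  get_meminfo HugePages_Surp              # prints 0 for the snapshot above

The "Node +([0-9]) " strip only matters for the per-node files, whose lines carry a "Node 0 " style prefix; for plain /proc/meminfo it is a no-op.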
00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78898856 kB' 'MemAvailable: 82197648 kB' 'Buffers: 12176 kB' 'Cached: 9430264 kB' 'SwapCached: 0 kB' 'Active: 6504868 kB' 'Inactive: 3456260 kB' 'Active(anon): 6111284 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522168 kB' 'Mapped: 201604 kB' 'Shmem: 5592596 kB' 'KReclaimable: 205184 kB' 'Slab: 525356 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320172 kB' 'KernelStack: 16256 kB' 'PageTables: 8372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7530476 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201096 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.720 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.720 09:08:59 
setup.sh.hugepages.no_shrink_alloc [... the setup/common.sh@31-32 cycle repeats against \H\u\g\e\P\a\g\e\s\_\R\s\v\d for each /proc/meminfo key, Inactive through HugePages_Total ...]
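The entry that follows shows setup/hugepages.sh@99-110 folding these lookups into its accounting: surp=0 and resv=0 are captured, nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0 and anon_hugepages=0 are echoed, and two arithmetic checks run before HugePages_Total is read again. A hypothetical condensation of that step (names are illustrative and it reuses the get_meminfo sketch above, so it is not the actual hugepages.sh source):

  # Illustrative only; mirrors the shape of the hugepages.sh@99-110 trace entries.
  nr_hugepages=1024                       # page count configured for this test
  surp=$(get_meminfo HugePages_Surp)      # 0 in the trace (hugepages.sh@99)
  resv=$(get_meminfo HugePages_Rsvd)      # 0 in the trace (hugepages.sh@100)
  anon=$(get_meminfo AnonHugePages)       # 0 in the trace

  echo "nr_hugepages=$nr_hugepages"
  echo "resv_hugepages=$resv"
  echo "surplus_hugepages=$surp"
  echo "anon_hugepages=$anon"

  # nothing may have drifted into surplus or reserved accounting ...
  (( nr_hugepages == nr_hugepages + surp + resv ))
  # ... and the kernel must still report the full allocation; the snapshots agree:
  # HugePages_Total: 1024 with Hugepagesize: 2048 kB gives Hugetlb: 2097152 kB.
  (( $(get_meminfo HugePages_Total) == nr_hugepages ))

A non-zero exit status from either arithmetic test is what would fail the no_shrink_alloc case.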
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:50.721 nr_hugepages=1024 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:50.721 resv_hugepages=0 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:50.721 surplus_hugepages=0 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:50.721 anon_hugepages=0 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.721 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78903204 kB' 'MemAvailable: 82201996 kB' 'Buffers: 12176 kB' 'Cached: 9430264 kB' 'SwapCached: 0 kB' 'Active: 6506680 kB' 'Inactive: 3456260 kB' 'Active(anon): 6113096 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524032 kB' 'Mapped: 201884 kB' 'Shmem: 5592596 kB' 'KReclaimable: 205184 kB' 'Slab: 525356 kB' 'SReclaimable: 205184 
kB' 'SUnreclaim: 320172 kB' 'KernelStack: 16496 kB' 'PageTables: 8852 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7533200 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue [... the setup/common.sh@31-32 cycle repeats against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l for the remaining /proc/meminfo keys, Inactive through Unaccepted ...] 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32
-- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:50.722 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36527412 kB' 'MemUsed: 11589528 kB' 'SwapCached: 0 kB' 'Active: 5334420 kB' 'Inactive: 3372048 kB' 'Active(anon): 5176524 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8435340 kB' 'Mapped: 90828 kB' 'AnonPages: 274344 kB' 'Shmem: 4905396 kB' 'KernelStack: 9368 kB' 'PageTables: 4204 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126012 kB' 'Slab: 329284 kB' 'SReclaimable: 126012 kB' 'SUnreclaim: 203272 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 
09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:50.723 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:50.723 node0=1024 expecting 1024 00:04:50.724 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:50.724 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:50.724 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:50.724 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:50.724 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:50.724 09:08:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:54.919 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:54.919 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:54.919 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:54.919 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:00:04.0 
(8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:54.919 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:54.919 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78833220 kB' 'MemAvailable: 82132012 kB' 'Buffers: 12176 kB' 'Cached: 9430388 kB' 'SwapCached: 0 kB' 'Active: 6510208 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116624 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527240 kB' 'Mapped: 201964 kB' 'Shmem: 5592720 kB' 'KReclaimable: 205184 kB' 'Slab: 525136 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 319952 kB' 'KernelStack: 16240 kB' 'PageTables: 8360 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7536740 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200988 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.919 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
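Note on the trace above: this stretch of the log is the no_shrink_alloc test re-verifying hugepage accounting. Each get_meminfo call walks every field of /proc/meminfo (or /sys/devices/system/node/nodeN/meminfo for a per-node lookup) under set -x, so every non-matching field shows up as a [[ ... ]] / continue pair, which is why a single lookup expands to dozens of trace lines. The lookups above returned HugePages_Total=1024 and HugePages_Surp=0 on node0 ("node0=1024 expecting 1024"), and setup.sh, re-run with NRHUGE=512 and CLEAR_HUGE=no, reported "Requested 512 hugepages but 1024 already allocated on node0", i.e. the existing allocation was not shrunk. A rough sketch of the lookup helper, reconstructed from this trace rather than taken verbatim from the SPDK sources (the authoritative version is setup/common.sh in the spdk repository), looks like this:

shopt -s extglob   # needed for the +([0-9]) pattern below

# Sketch reconstructed from the xtrace above; not the actual SPDK setup/common.sh.
# get_meminfo FIELD [NODE] - print FIELD's value from /proc/meminfo,
# or from the given NUMA node's meminfo when NODE is supplied.
get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f=/proc/meminfo mem

        # Per-node files prefix every line with "Node N ", so strip that
        # prefix to keep the field names identical to /proc/meminfo.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
                mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")

        # Walk the fields one by one; under set -x this loop is what
        # produces the long runs of "[[ X == pattern ]] / continue" lines.
        while IFS=': ' read -r var val _; do
                [[ $var == "$get" ]] || continue
                echo "${val:-0}" && return 0
        done < <(printf '%s\n' "${mem[@]}")
        echo 0
}

# e.g. "get_meminfo HugePages_Total" prints 1024 here, and
# "get_meminfo HugePages_Surp 0" prints 0 for node0, matching the
# "-- # echo 1024" / "-- # echo 0" lines in the trace.

The scan that resumes below is still the AnonHugePages lookup (the transparent-hugepage check in verify_nr_hugepages); it returns 0, after which the same walk runs once more for HugePages_Surp before the per-node totals are compared against the expected 1024 pages.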
00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.920 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@97 -- # anon=0 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78834356 kB' 'MemAvailable: 82133148 kB' 'Buffers: 12176 kB' 'Cached: 9430388 kB' 'SwapCached: 0 kB' 'Active: 6509884 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116300 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526940 kB' 'Mapped: 201916 kB' 'Shmem: 5592720 kB' 'KReclaimable: 205184 kB' 'Slab: 525212 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320028 kB' 'KernelStack: 16240 kB' 'PageTables: 8380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7536756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200972 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.921 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.922 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@18 -- # local node= 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78834916 kB' 'MemAvailable: 82133708 kB' 'Buffers: 12176 kB' 'Cached: 9430408 kB' 'SwapCached: 0 kB' 'Active: 6509872 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116288 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526928 kB' 'Mapped: 201916 kB' 'Shmem: 5592740 kB' 'KReclaimable: 205184 kB' 'Slab: 525212 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320028 kB' 'KernelStack: 16240 kB' 'PageTables: 8380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7536780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200972 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 
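[editor's note] The trace above is the get_meminfo helper scanning /proc/meminfo field by field (it snapshots the file, then reads each "Field: value" pair and skips everything that is not the requested key). A minimal sketch of that helper, reconstructed only from the xtrace lines in this log; the real setup/common.sh may differ in structure, and the variable names simply follow the trace.

    #!/usr/bin/env bash
    # Sketch of get_meminfo as it appears in the xtrace (setup/common.sh@17-33);
    # reconstructed from the log, not copied from the SPDK source.
    shopt -s extglob                     # the "Node +([0-9]) " strip below is an extglob pattern

    get_meminfo() {
            local get=$1                 # meminfo field to look up, e.g. HugePages_Rsvd
            local node=$2                # optional NUMA node; empty means system-wide
            local var val
            local mem_f mem
            mem_f=/proc/meminfo
            # Prefer the per-node file when it exists (node0, node1, ...).
            [[ -e /sys/devices/system/node/node$node/meminfo ]] && \
                    mem_f=/sys/devices/system/node/node$node/meminfo
            mapfile -t mem < "$mem_f"
            mem=("${mem[@]#Node +([0-9]) }")          # drop the "Node N " prefix of per-node files
            while IFS=': ' read -r var val _; do      # split "Field:   <value> kB"
                    [[ $var == "$get" ]] || continue  # every other field is skipped, as in the trace
                    echo "$val"
                    return 0
            done < <(printf '%s\n' "${mem[@]}")
            return 1
    }

    surp=$(get_meminfo HugePages_Surp)   # -> 0 on the machine in this log

[end editor's note]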
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.923 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.924 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:54.925 nr_hugepages=1024 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:54.925 resv_hugepages=0 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:54.925 surplus_hugepages=0 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:54.925 anon_hugepages=0 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78834412 kB' 'MemAvailable: 82133204 kB' 'Buffers: 12176 kB' 'Cached: 9430428 kB' 'SwapCached: 0 kB' 'Active: 6509844 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116260 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526852 kB' 'Mapped: 201916 kB' 'Shmem: 5592760 kB' 'KReclaimable: 205184 kB' 'Slab: 525212 kB' 'SReclaimable: 205184 kB' 'SUnreclaim: 320028 kB' 'KernelStack: 16224 kB' 'PageTables: 8332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7536800 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200972 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 753060 kB' 'DirectMap2M: 13602816 kB' 'DirectMap1G: 87031808 kB' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.925 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:54.926 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- 
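[editor's note] Once HugePages_Total comes back as 1024, hugepages.sh checks that the pool adds up (nr_hugepages + surplus + reserved) and then walks the NUMA nodes under /sys/devices/system/node before re-running get_meminfo with node=0. A sketch of that accounting step, assuming the get_meminfo sketch above; the real script's nodes_sys/nodes_test bookkeeping is only approximated here.

    #!/usr/bin/env bash
    # Sketch of the accounting step visible in the trace (hugepages.sh@99-117).
    shopt -s extglob

    nr_hugepages=1024                       # pool size this test expects
    surp=$(get_meminfo HugePages_Surp)      # 0 in the log
    resv=$(get_meminfo HugePages_Rsvd)      # 0 in the log
    total=$(get_meminfo HugePages_Total)    # 1024 in the log

    # Global sanity check: allocated pages == expected + surplus + reserved.
    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2

    # Enumerate NUMA nodes the same way the trace does and read each node's
    # surplus pages from its per-node meminfo.
    for node in /sys/devices/system/node/node+([0-9]); do
            n=${node##*node}                # "node0" -> "0"
            echo "node $n HugePages_Surp: $(get_meminfo HugePages_Surp "$n")"
    done

[end editor's note]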
00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:54.927 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36491708 kB' 'MemUsed: 11625232 kB' 'SwapCached: 0 kB' 'Active: 5329616 kB' 'Inactive: 3372048 kB' 'Active(anon): 5171720 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8435448 kB' 'Mapped: 90772 kB' 'AnonPages: 269344 kB' 'Shmem: 4905504 kB' 'KernelStack: 9384 kB' 'PageTables: 4284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126012 kB' 'Slab: 329052 kB' 'SReclaimable: 126012 kB' 'SUnreclaim: 203040 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[setup/common.sh@31-@32: the read loop steps through the node-0 fields above (MemTotal through HugePages_Free); none of them match HugePages_Surp, so each is skipped with continue]
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
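The field-by-field scan summarised above is a single lookup: walk the chosen meminfo file until the requested key appears and print its value. The function below is a self-contained approximation written for illustration; it mirrors the traced logic, with sed standing in for the mapfile-based 'Node N ' prefix stripping that setup/common.sh performs.

#!/usr/bin/env bash
# get_meminfo FIELD [NODE]: print FIELD's value from /proc/meminfo, or from
# /sys/devices/system/node/node<NODE>/meminfo when a node number is given.
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Per-node files prefix each line with "Node <n> "; strip that so both
    # sources present the same "Key:   value [kB]" layout.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
    return 1
}

get_meminfo HugePages_Total      # prints 1024 on the node traced above
get_meminfo HugePages_Surp 0     # prints 0 for node 0

Called as in the last two lines, it returns the same 1024 and 0 that the echo statements in the trace report.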
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:54.928 node0=1024 expecting 1024
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:54.928
00:04:54.928 real 0m7.307s
00:04:54.928 user 0m2.859s
00:04:54.928 sys 0m4.639s
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:54.928 09:09:03 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:54.928 ************************************
00:04:54.928 END TEST no_shrink_alloc
00:04:54.928 ************************************
00:04:54.928 09:09:03 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:54.928 09:09:03 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:54.928
00:04:54.928 real 0m29.191s
00:04:54.928 user 0m10.361s
00:04:54.928 sys 0m17.058s
00:04:54.928 09:09:03 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:54.928 09:09:03 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:54.928 ************************************
00:04:54.928 END TEST hugepages
00:04:54.928 ************************************
00:04:54.928 09:09:03 setup.sh -- common/autotest_common.sh@1142 -- # return 0
00:04:54.928 09:09:03 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh
00:04:54.928 09:09:03 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:54.928 09:09:03 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:54.928 09:09:03 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:04:54.928 ************************************
00:04:54.928 START TEST driver
00:04:54.928 ************************************
00:04:54.928 09:09:03 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh
00:04:54.928 * Looking for test storage...
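The clear_hp step above simply zeroes every per-node hugepage pool so the next suite starts from a clean slate. Below is a hedged, standalone rendering of that cleanup; the sudo tee form is an illustrative substitute for the plain redirects available when the harness already runs as root.

#!/usr/bin/env bash
# Return every hugepage on every NUMA node to the regular page pool.
for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
        echo 0 | sudo tee "$hp/nr_hugepages" > /dev/null
    done
done
# Exported in the trace above so the next setup.sh run knows the pool was reset.
export CLEAR_HUGE=yes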
00:04:54.928 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:04:54.928 09:09:03 setup.sh.driver -- setup/driver.sh@68 -- # setup reset
00:04:54.928 09:09:03 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:54.928 09:09:03 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:04:59.118 09:09:08 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:04:59.118 09:09:08 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:59.118 09:09:08 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:59.118 09:09:08 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x
00:04:59.118 ************************************
00:04:59.118 START TEST guess_driver
00:04:59.118 ************************************
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 ))
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci
00:04:59.118 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:04:59.377 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz
00:04:59.377 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:59.377 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:59.377 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:59.377 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:59.377 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz
00:04:59.377 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz
00:04:59.377 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:04:59.377 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0
00:04:59.377 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci
00:04:59.377 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci
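guess_driver settles on vfio-pci above because the host exposes 216 IOMMU groups and modprobe resolves vfio_pci and its dependencies to real kernel objects. The function below is an illustrative reconstruction of that decision, not a copy of setup/driver.sh; in particular the uio_pci_generic fallback is an assumption for hosts without a usable IOMMU.

#!/usr/bin/env bash
shopt -s nullglob
# Pick a userspace PCI driver the way the traced check does: prefer vfio-pci
# when IOMMU groups exist (or unsafe no-IOMMU mode is enabled) and the module
# chain resolves to real .ko objects.
pick_driver() {
    local unsafe_vfio=N
    if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
        unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    fi
    local iommu_groups=(/sys/kernel/iommu_groups/*)
    if [[ $unsafe_vfio == Y ]] || (( ${#iommu_groups[@]} > 0 )); then
        if modprobe --show-depends vfio_pci 2> /dev/null | grep -q '\.ko'; then
            echo vfio-pci
            return 0
        fi
    fi
    echo uio_pci_generic   # assumed fallback, for illustration only
}

echo "Looking for driver=$(pick_driver)"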
00:04:59.377 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:04:59.377 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:04:59.378 Looking for driver=vfio-pci
00:04:59.378 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:04:59.378 09:09:08 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config
00:04:59.378 09:09:08 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]]
00:04:59.378 09:09:08 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
[setup/driver.sh@57-@61: the loop reads the setup.sh config output line by line; the first two lines carry no '->' marker and are skipped, and every following '->' line reports its device bound to vfio-pci]
00:05:06.108 09:09:14 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 ))
00:05:06.108 09:09:14 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset
00:05:06.108 09:09:14 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]]
00:05:06.108 09:09:14 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:05:11.455
00:05:11.455 real
0m11.473s 00:05:11.455 user 0m3.002s 00:05:11.455 sys 0m5.407s 00:05:11.455 09:09:19 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:11.455 09:09:19 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:11.455 ************************************ 00:05:11.455 END TEST guess_driver 00:05:11.455 ************************************ 00:05:11.455 09:09:19 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:05:11.455 00:05:11.455 real 0m16.093s 00:05:11.455 user 0m4.164s 00:05:11.455 sys 0m7.901s 00:05:11.455 09:09:19 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:11.455 09:09:19 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:11.455 ************************************ 00:05:11.455 END TEST driver 00:05:11.455 ************************************ 00:05:11.455 09:09:19 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:11.455 09:09:19 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:11.455 09:09:19 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:11.455 09:09:19 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:11.455 09:09:19 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:11.455 ************************************ 00:05:11.455 START TEST devices 00:05:11.455 ************************************ 00:05:11.455 09:09:19 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:11.455 * Looking for test storage... 00:05:11.455 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:11.455 09:09:19 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:11.455 09:09:19 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:11.455 09:09:19 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:11.455 09:09:19 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@200 -- 
# for block in "/sys/block/nvme"!(*c*) 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:15.659 09:09:23 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:15.659 09:09:23 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:15.659 No valid GPT data, bailing 00:05:15.659 09:09:23 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:15.659 09:09:23 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:15.659 09:09:23 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:15.659 09:09:23 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:15.659 09:09:23 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:15.659 09:09:23 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:15.659 09:09:23 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.659 09:09:23 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:15.659 ************************************ 00:05:15.659 START TEST nvme_mount 00:05:15.659 ************************************ 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- 
setup/common.sh@44 -- # parts=() 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:15.659 09:09:23 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:16.225 Creating new GPT entries in memory. 00:05:16.225 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:16.225 other utilities. 00:05:16.225 09:09:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:16.225 09:09:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:16.225 09:09:24 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:16.225 09:09:24 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:16.225 09:09:24 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:17.159 Creating new GPT entries in memory. 00:05:17.159 The operation has completed successfully. 
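The partition_drive trace above boils down to two sgdisk calls: wipe the existing label, then create a single 1 GiB partition at the conventional 2048-sector offset (sectors 2048:2099199, as in the log). The sketch below reproduces those calls against a scratch disk; partprobe is an illustrative stand-in for the sync_dev_uevents.sh helper the harness uses to wait for the new partition node to appear.

#!/usr/bin/env bash
# DANGER: destroys all data on $disk. Intended only for a dedicated test disk.
set -euo pipefail
disk=/dev/nvme0n1

sgdisk "$disk" --zap-all                     # drop existing GPT and MBR structures
# 2099199 - 2048 + 1 = 2097152 sectors = 1 GiB at 512-byte sectors.
flock "$disk" sgdisk "$disk" --new=1:2048:2099199
partprobe "$disk"                            # have the kernel re-read the table
lsblk "$disk"                                # nvme0n1p1 should now be listed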
00:05:17.159 09:09:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:17.159 09:09:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:17.159 09:09:25 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 27713 00:05:17.159 09:09:25 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.159 09:09:25 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:17.159 09:09:25 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.159 09:09:25 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:17.159 09:09:25 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:17.159 09:09:26 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:20.470 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.470 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:20.470 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:20.470 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.470 09:09:29 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
[setup/devices.sh@60-@62: the loop reads the rest of the setup.sh config output; none of the remaining PCI functions (0000:00:04.6 through 0000:00:04.0, 0000:d7:05.5, 0000:85:05.5, 0000:80:04.7 through 0000:80:04.0) match 0000:5e:00.0]
00:05:20.726 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]]
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1
00:05:20.727 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:05:20.727 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:05:20.984 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54
00:05:20.984 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54
00:05:20.984 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:05:20.984 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M
00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M
00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]]
00:05:20.984
09:09:29 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:20.984 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:21.242 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:21.242 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:21.242 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.242 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:21.242 09:09:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:21.242 09:09:29 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:21.242 09:09:29 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:24.525 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:24.781 09:09:33 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:28.059 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:28.060 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:28.060 00:05:28.060 real 0m13.085s 00:05:28.060 user 0m3.861s 00:05:28.060 sys 0m7.026s 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:28.060 09:09:36 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:28.060 ************************************ 00:05:28.060 END TEST nvme_mount 00:05:28.060 ************************************ 00:05:28.318 09:09:37 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:28.318 09:09:37 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:28.318 09:09:37 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:28.318 09:09:37 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:28.318 09:09:37 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:28.318 ************************************ 00:05:28.318 START TEST dm_mount 00:05:28.318 ************************************ 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:28.318 09:09:37 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:28.318 09:09:37 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:29.251 Creating new GPT entries in memory. 00:05:29.251 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:29.251 other utilities. 00:05:29.251 09:09:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:29.251 09:09:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:29.251 09:09:38 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:29.251 09:09:38 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:29.251 09:09:38 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:30.194 Creating new GPT entries in memory. 00:05:30.194 The operation has completed successfully. 00:05:30.194 09:09:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:30.194 09:09:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:30.194 09:09:39 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:30.194 09:09:39 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:30.194 09:09:39 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:31.218 The operation has completed successfully. 
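[Editor's note] The dm_mount setup traced above carves the NVMe disk into two 1 GiB GPT partitions before building a device-mapper target on top of them: sgdisk --zap-all wipes any existing partition table, then two --new calls allocate 2,097,152-sector partitions back to back (1:2048:2099199 and 2:2099200:4196351). A minimal stand-alone sketch of the same layout, outside the harness (the flock and sync_dev_uevents.sh wrappers are SPDK test plumbing), assuming /dev/nvme0n1 is a disposable scratch disk:

    disk=/dev/nvme0n1
    size_sectors=$((1073741824 / 512))    # 1 GiB in 512-byte sectors = 2097152
    sgdisk "$disk" --zap-all              # destroy any existing GPT/MBR structures
    start=2048
    for part in 1 2; do
        end=$((start + size_sectors - 1))
        sgdisk "$disk" --new=${part}:${start}:${end}   # yields 1:2048:2099199, then 2:2099200:4196351
        start=$((end + 1))
    done
    partprobe "$disk"                     # stand-in for the harness's uevent synchronization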
00:05:31.218 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:31.218 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:31.218 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 31968 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 
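[Editor's note] devices.sh@155 creates the nvme_dm_test mapping, waits for /dev/mapper/nvme_dm_test to appear, resolves it to dm-0, checks that both partitions list dm-0 as a holder, and then formats and mounts it. The dmsetup table is fed on stdin and therefore never shows up in the xtrace; a plausible minimal equivalent, assuming a linear concatenation of the two partitions (consistent with the holder checks above) and a hypothetical /mnt/dm_mount mount point:

    p1=/dev/nvme0n1p1; p2=/dev/nvme0n1p2
    s1=$(blockdev --getsz "$p1"); s2=$(blockdev --getsz "$p2")   # partition sizes in 512-byte sectors
    printf '0 %s linear %s 0\n%s %s linear %s 0\n' "$s1" "$p1" "$s1" "$s2" "$p2" \
        | dmsetup create nvme_dm_test                            # table read from stdin, as in the script
    mkfs.ext4 -qF /dev/mapper/nvme_dm_test
    mkdir -p /mnt/dm_mount && mount /dev/mapper/nvme_dm_test /mnt/dm_mount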
00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:31.477 09:09:40 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.763 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.764 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.764 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local 
mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:35.020 09:09:43 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
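[Editor's note] This verify pass mounts nothing; it only pattern-matches setup.sh's "Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0" line, i.e. it confirms the PCI device is left unbound because its partitions are still claimed by the device-mapper target. The same holder relationship can be inspected directly from sysfs and dmsetup, for example:

    for part in nvme0n1p1 nvme0n1p2; do
        echo "$part -> $(ls /sys/class/block/$part/holders/)"   # prints dm-0 while the mapping exists
    done
    dmsetup info -c nvme_dm_test    # name, dm minor and open count of the mapping
    dmsetup deps nvme_dm_test       # backing devices listed by major:minor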
00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:38.294 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:38.295 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:38.295 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:38.295 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:38.295 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:38.552 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 
(ext4): 53 ef 00:05:38.552 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:38.552 09:09:47 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:38.552 00:05:38.552 real 0m10.200s 00:05:38.552 user 0m2.361s 00:05:38.552 sys 0m4.753s 00:05:38.552 09:09:47 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:38.552 09:09:47 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:38.553 ************************************ 00:05:38.553 END TEST dm_mount 00:05:38.553 ************************************ 00:05:38.553 09:09:47 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:38.553 09:09:47 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:38.553 09:09:47 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:38.553 09:09:47 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.553 09:09:47 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:38.553 09:09:47 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:38.553 09:09:47 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:38.553 09:09:47 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:38.810 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:38.810 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:38.810 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:38.810 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:38.810 09:09:47 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:38.810 09:09:47 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:38.810 09:09:47 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:38.810 09:09:47 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:38.810 09:09:47 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:38.810 09:09:47 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:38.810 09:09:47 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:38.810 00:05:38.810 real 0m27.948s 00:05:38.810 user 0m7.785s 00:05:38.810 sys 0m14.801s 00:05:38.810 09:09:47 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:38.810 09:09:47 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:38.810 ************************************ 00:05:38.810 END TEST devices 00:05:38.810 ************************************ 00:05:38.810 09:09:47 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:38.810 00:05:38.810 real 1m40.802s 00:05:38.810 user 0m30.990s 00:05:38.810 sys 0m55.606s 00:05:38.810 09:09:47 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:38.810 09:09:47 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:38.810 ************************************ 00:05:38.810 END TEST setup.sh 00:05:38.810 ************************************ 00:05:38.810 09:09:47 -- common/autotest_common.sh@1142 -- # return 0 00:05:38.810 09:09:47 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:42.988 0000:d7:05.5 (8086 201d): 
Skipping not allowed VMD controller at 0000:d7:05.5 00:05:42.989 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:42.989 Hugepages 00:05:42.989 node hugesize free / total 00:05:42.989 node0 1048576kB 0 / 0 00:05:42.989 node0 2048kB 1024 / 1024 00:05:42.989 node1 1048576kB 0 / 0 00:05:42.989 node1 2048kB 1024 / 1024 00:05:42.989 00:05:42.989 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:42.989 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:42.989 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:42.989 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:42.989 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:42.989 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:42.989 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:42.989 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:42.989 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:42.989 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:42.989 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:42.989 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:42.989 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:42.989 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:42.989 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:42.989 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:42.989 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:42.989 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:42.989 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:05:42.989 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:05:42.989 09:09:51 -- spdk/autotest.sh@130 -- # uname -s 00:05:42.989 09:09:51 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:42.989 09:09:51 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:42.989 09:09:51 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:46.267 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:46.267 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:46.267 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:46.267 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:46.267 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:46.267 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:46.267 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:46.267 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:46.267 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:46.267 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:46.267 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:46.267 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:46.525 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:46.525 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:46.525 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:46.525 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:46.525 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:46.525 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:49.055 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:49.055 09:09:57 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:49.991 09:09:58 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:49.991 09:09:58 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:49.991 09:09:58 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:49.991 09:09:58 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:49.991 09:09:58 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:49.991 09:09:58 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:49.991 09:09:58 -- 
common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:49.991 09:09:58 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:49.991 09:09:58 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:50.249 09:09:58 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:50.249 09:09:58 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:50.249 09:09:58 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:53.557 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:53.557 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:53.557 Waiting for block devices as requested 00:05:53.557 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:05:53.557 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:53.815 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:53.815 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:53.815 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:54.072 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:54.072 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:54.072 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:54.330 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:54.330 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:54.330 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:54.588 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:54.588 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:54.588 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:54.845 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:54.845 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:54.845 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:55.103 09:10:03 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:55.103 09:10:03 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:55.103 09:10:03 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:05:55.103 09:10:03 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:05:55.103 09:10:03 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:55.103 09:10:03 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:55.103 09:10:03 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:55.103 09:10:03 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:55.103 09:10:03 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:55.103 09:10:03 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:55.103 09:10:03 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:55.103 09:10:03 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:55.103 09:10:03 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:55.103 09:10:03 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:05:55.103 09:10:03 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:55.103 09:10:03 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:55.103 09:10:03 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:55.103 09:10:03 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:05:55.103 09:10:03 -- common/autotest_common.sh@1554 -- # cut 
-d: -f2 00:05:55.103 09:10:03 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:55.103 09:10:03 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:55.103 09:10:03 -- common/autotest_common.sh@1557 -- # continue 00:05:55.103 09:10:03 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:55.103 09:10:03 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:55.103 09:10:03 -- common/autotest_common.sh@10 -- # set +x 00:05:55.103 09:10:03 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:55.103 09:10:03 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:55.103 09:10:03 -- common/autotest_common.sh@10 -- # set +x 00:05:55.104 09:10:03 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:58.386 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:58.386 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:58.386 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:58.645 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:58.645 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:58.645 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:58.645 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:58.645 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:58.645 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:58.903 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:58.903 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:58.903 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:58.903 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:58.903 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:58.903 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:58.903 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:58.903 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:58.903 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:01.436 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:01.436 09:10:10 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:06:01.436 09:10:10 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:01.436 09:10:10 -- common/autotest_common.sh@10 -- # set +x 00:06:01.436 09:10:10 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:06:01.436 09:10:10 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:06:01.436 09:10:10 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:06:01.436 09:10:10 -- common/autotest_common.sh@1577 -- # bdfs=() 00:06:01.436 09:10:10 -- common/autotest_common.sh@1577 -- # local bdfs 00:06:01.436 09:10:10 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:06:01.436 09:10:10 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:01.436 09:10:10 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:01.436 09:10:10 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:01.436 09:10:10 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:01.436 09:10:10 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:01.692 09:10:10 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:01.692 09:10:10 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:01.692 09:10:10 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:06:01.692 09:10:10 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:06:01.692 09:10:10 -- common/autotest_common.sh@1580 -- # device=0x0b60 
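[Editor's note] The nvme_namespace_revert gate traced above reads two identify-controller fields and concludes there is nothing to do: oacs comes back as 0x3f, whose 0x8 bit means Namespace Management/Attachment is supported, and unvmcap comes back as 0, meaning no unallocated capacity is left to reclaim, so the loop hits "continue". A condensed sketch of the same check (assuming nvme-cli and /dev/nvme0, as in the log):

    ctrlr=/dev/nvme0
    oacs=$(nvme id-ctrl "$ctrlr" | grep oacs | cut -d: -f2)          # ' 0x3f' in this run
    if (( oacs & 0x8 )); then                                        # namespace management supported
        unvmcap=$(nvme id-ctrl "$ctrlr" | grep unvmcap | cut -d: -f2)
        (( unvmcap == 0 )) && echo "no unallocated capacity, skipping namespace revert"
    fi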
00:06:01.692 09:10:10 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:06:01.692 09:10:10 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:06:01.692 09:10:10 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:06:01.692 09:10:10 -- common/autotest_common.sh@1593 -- # return 0 00:06:01.692 09:10:10 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:01.692 09:10:10 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:01.692 09:10:10 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:01.692 09:10:10 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:01.692 09:10:10 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:02.258 Restarting all devices. 00:06:06.440 lstat() error: No such file or directory 00:06:06.440 QAT Error: No GENERAL section found 00:06:06.440 Failed to configure qat_dev0 00:06:06.440 lstat() error: No such file or directory 00:06:06.440 QAT Error: No GENERAL section found 00:06:06.440 Failed to configure qat_dev1 00:06:06.440 lstat() error: No such file or directory 00:06:06.440 QAT Error: No GENERAL section found 00:06:06.440 Failed to configure qat_dev2 00:06:06.440 enable sriov 00:06:06.440 Checking status of all devices. 00:06:06.440 There is 3 QAT acceleration device(s) in the system: 00:06:06.440 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:06:06.440 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:06:06.440 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:06:07.389 0000:3d:00.0 set to 16 VFs 00:06:08.765 0000:3f:00.0 set to 16 VFs 00:06:10.141 0000:da:00.0 set to 16 VFs 00:06:13.430 Properly configured the qat device with driver uio_pci_generic. 00:06:13.430 09:10:22 -- spdk/autotest.sh@162 -- # timing_enter lib 00:06:13.430 09:10:22 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:13.430 09:10:22 -- common/autotest_common.sh@10 -- # set +x 00:06:13.430 09:10:22 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:06:13.430 09:10:22 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:13.430 09:10:22 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:13.430 09:10:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:13.430 09:10:22 -- common/autotest_common.sh@10 -- # set +x 00:06:13.430 ************************************ 00:06:13.430 START TEST env 00:06:13.430 ************************************ 00:06:13.430 09:10:22 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:13.430 * Looking for test storage... 
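[Editor's note] The qat_setup.sh output a few entries up shows the usual flow for these c6xx boards: the legacy per-device configuration apparently isn't present (hence the lstat()/"No GENERAL section found" warnings), so SR-IOV is enabled instead, each physical function is given 16 virtual functions, and the VFs end up on uio_pci_generic. The "set to 16 VFs" lines correspond to the standard sysfs SR-IOV interface; a minimal stand-alone equivalent for the three endpoints reported above (the script additionally handles rebinding the resulting VFs) might look like:

    for bdf in 0000:3d:00.0 0000:3f:00.0 0000:da:00.0; do
        echo 0  > /sys/bus/pci/devices/$bdf/sriov_numvfs   # numvfs must be 0 before it can be changed
        echo 16 > /sys/bus/pci/devices/$bdf/sriov_numvfs   # create 16 virtual functions
        cat /sys/bus/pci/devices/$bdf/sriov_numvfs         # sanity check: should print 16
    done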
00:06:13.430 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:06:13.430 09:10:22 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:13.430 09:10:22 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:13.430 09:10:22 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:13.430 09:10:22 env -- common/autotest_common.sh@10 -- # set +x 00:06:13.430 ************************************ 00:06:13.430 START TEST env_memory 00:06:13.430 ************************************ 00:06:13.430 09:10:22 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:13.430 00:06:13.430 00:06:13.430 CUnit - A unit testing framework for C - Version 2.1-3 00:06:13.430 http://cunit.sourceforge.net/ 00:06:13.430 00:06:13.430 00:06:13.430 Suite: memory 00:06:13.430 Test: alloc and free memory map ...[2024-07-15 09:10:22.345206] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:13.430 passed 00:06:13.430 Test: mem map translation ...[2024-07-15 09:10:22.374503] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:13.430 [2024-07-15 09:10:22.374526] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:13.430 [2024-07-15 09:10:22.374582] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:13.430 [2024-07-15 09:10:22.374596] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:13.688 passed 00:06:13.688 Test: mem map registration ...[2024-07-15 09:10:22.432415] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:13.688 [2024-07-15 09:10:22.432440] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:13.688 passed 00:06:13.688 Test: mem map adjacent registrations ...passed 00:06:13.688 00:06:13.688 Run Summary: Type Total Ran Passed Failed Inactive 00:06:13.688 suites 1 1 n/a 0 0 00:06:13.688 tests 4 4 4 0 0 00:06:13.688 asserts 152 152 152 0 n/a 00:06:13.688 00:06:13.688 Elapsed time = 0.198 seconds 00:06:13.688 00:06:13.688 real 0m0.213s 00:06:13.688 user 0m0.199s 00:06:13.688 sys 0m0.013s 00:06:13.688 09:10:22 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:13.688 09:10:22 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:13.688 ************************************ 00:06:13.688 END TEST env_memory 00:06:13.688 ************************************ 00:06:13.688 09:10:22 env -- common/autotest_common.sh@1142 -- # return 0 00:06:13.688 09:10:22 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:13.688 09:10:22 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:13.688 09:10:22 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:06:13.688 09:10:22 env -- common/autotest_common.sh@10 -- # set +x 00:06:13.688 ************************************ 00:06:13.688 START TEST env_vtophys 00:06:13.688 ************************************ 00:06:13.688 09:10:22 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:13.688 EAL: lib.eal log level changed from notice to debug 00:06:13.688 EAL: Detected lcore 0 as core 0 on socket 0 00:06:13.688 EAL: Detected lcore 1 as core 1 on socket 0 00:06:13.688 EAL: Detected lcore 2 as core 2 on socket 0 00:06:13.688 EAL: Detected lcore 3 as core 3 on socket 0 00:06:13.688 EAL: Detected lcore 4 as core 4 on socket 0 00:06:13.688 EAL: Detected lcore 5 as core 8 on socket 0 00:06:13.688 EAL: Detected lcore 6 as core 9 on socket 0 00:06:13.689 EAL: Detected lcore 7 as core 10 on socket 0 00:06:13.689 EAL: Detected lcore 8 as core 11 on socket 0 00:06:13.689 EAL: Detected lcore 9 as core 16 on socket 0 00:06:13.689 EAL: Detected lcore 10 as core 17 on socket 0 00:06:13.689 EAL: Detected lcore 11 as core 18 on socket 0 00:06:13.689 EAL: Detected lcore 12 as core 19 on socket 0 00:06:13.689 EAL: Detected lcore 13 as core 20 on socket 0 00:06:13.689 EAL: Detected lcore 14 as core 24 on socket 0 00:06:13.689 EAL: Detected lcore 15 as core 25 on socket 0 00:06:13.689 EAL: Detected lcore 16 as core 26 on socket 0 00:06:13.689 EAL: Detected lcore 17 as core 27 on socket 0 00:06:13.689 EAL: Detected lcore 18 as core 0 on socket 1 00:06:13.689 EAL: Detected lcore 19 as core 1 on socket 1 00:06:13.689 EAL: Detected lcore 20 as core 2 on socket 1 00:06:13.689 EAL: Detected lcore 21 as core 3 on socket 1 00:06:13.689 EAL: Detected lcore 22 as core 4 on socket 1 00:06:13.689 EAL: Detected lcore 23 as core 8 on socket 1 00:06:13.689 EAL: Detected lcore 24 as core 9 on socket 1 00:06:13.689 EAL: Detected lcore 25 as core 10 on socket 1 00:06:13.689 EAL: Detected lcore 26 as core 11 on socket 1 00:06:13.689 EAL: Detected lcore 27 as core 16 on socket 1 00:06:13.689 EAL: Detected lcore 28 as core 17 on socket 1 00:06:13.689 EAL: Detected lcore 29 as core 18 on socket 1 00:06:13.689 EAL: Detected lcore 30 as core 19 on socket 1 00:06:13.689 EAL: Detected lcore 31 as core 20 on socket 1 00:06:13.689 EAL: Detected lcore 32 as core 24 on socket 1 00:06:13.689 EAL: Detected lcore 33 as core 25 on socket 1 00:06:13.689 EAL: Detected lcore 34 as core 26 on socket 1 00:06:13.689 EAL: Detected lcore 35 as core 27 on socket 1 00:06:13.689 EAL: Detected lcore 36 as core 0 on socket 0 00:06:13.689 EAL: Detected lcore 37 as core 1 on socket 0 00:06:13.689 EAL: Detected lcore 38 as core 2 on socket 0 00:06:13.689 EAL: Detected lcore 39 as core 3 on socket 0 00:06:13.689 EAL: Detected lcore 40 as core 4 on socket 0 00:06:13.689 EAL: Detected lcore 41 as core 8 on socket 0 00:06:13.689 EAL: Detected lcore 42 as core 9 on socket 0 00:06:13.689 EAL: Detected lcore 43 as core 10 on socket 0 00:06:13.689 EAL: Detected lcore 44 as core 11 on socket 0 00:06:13.689 EAL: Detected lcore 45 as core 16 on socket 0 00:06:13.689 EAL: Detected lcore 46 as core 17 on socket 0 00:06:13.689 EAL: Detected lcore 47 as core 18 on socket 0 00:06:13.689 EAL: Detected lcore 48 as core 19 on socket 0 00:06:13.689 EAL: Detected lcore 49 as core 20 on socket 0 00:06:13.689 EAL: Detected lcore 50 as core 24 on socket 0 00:06:13.689 EAL: Detected lcore 51 as core 25 on socket 0 00:06:13.689 EAL: Detected lcore 52 as core 
26 on socket 0 00:06:13.689 EAL: Detected lcore 53 as core 27 on socket 0 00:06:13.689 EAL: Detected lcore 54 as core 0 on socket 1 00:06:13.689 EAL: Detected lcore 55 as core 1 on socket 1 00:06:13.689 EAL: Detected lcore 56 as core 2 on socket 1 00:06:13.689 EAL: Detected lcore 57 as core 3 on socket 1 00:06:13.689 EAL: Detected lcore 58 as core 4 on socket 1 00:06:13.689 EAL: Detected lcore 59 as core 8 on socket 1 00:06:13.689 EAL: Detected lcore 60 as core 9 on socket 1 00:06:13.689 EAL: Detected lcore 61 as core 10 on socket 1 00:06:13.689 EAL: Detected lcore 62 as core 11 on socket 1 00:06:13.689 EAL: Detected lcore 63 as core 16 on socket 1 00:06:13.689 EAL: Detected lcore 64 as core 17 on socket 1 00:06:13.689 EAL: Detected lcore 65 as core 18 on socket 1 00:06:13.689 EAL: Detected lcore 66 as core 19 on socket 1 00:06:13.689 EAL: Detected lcore 67 as core 20 on socket 1 00:06:13.689 EAL: Detected lcore 68 as core 24 on socket 1 00:06:13.689 EAL: Detected lcore 69 as core 25 on socket 1 00:06:13.689 EAL: Detected lcore 70 as core 26 on socket 1 00:06:13.689 EAL: Detected lcore 71 as core 27 on socket 1 00:06:13.689 EAL: Maximum logical cores by configuration: 128 00:06:13.689 EAL: Detected CPU lcores: 72 00:06:13.689 EAL: Detected NUMA nodes: 2 00:06:13.689 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:13.689 EAL: Detected shared linkage of DPDK 00:06:13.689 EAL: No shared files mode enabled, IPC will be disabled 00:06:13.948 EAL: No shared files mode enabled, IPC is disabled 00:06:13.948 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:02.2 
wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:06:13.948 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:06:13.948 EAL: Bus pci wants IOVA as 'PA' 00:06:13.948 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:13.948 EAL: Bus vdev wants IOVA as 'DC' 00:06:13.948 EAL: Selected IOVA mode 'PA' 00:06:13.948 EAL: Probing VFIO support... 00:06:13.948 EAL: IOMMU type 1 (Type 1) is supported 00:06:13.948 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:13.948 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:13.948 EAL: VFIO support initialized 00:06:13.948 EAL: Ask a virtual area of 0x2e000 bytes 00:06:13.948 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:13.948 EAL: Setting up physically contiguous memory... 
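[editor's note] The "Ask a virtual area ... / Virtual area found ... / VA reserved for memseg list ..." entries that follow record EAL reserving large spans of address space for its memseg lists before any hugepages are actually mapped. As a rough, hedged illustration only (this is not DPDK's internal code), reserving a comparable region from a standalone C program looks roughly like the sketch below; the 0x400000000 size simply mirrors the requests seen in this log.

/*
 * Minimal sketch (not DPDK's EAL code) of reserving virtual address space
 * without committing physical memory, which is what the "VA reserved for
 * memseg list" entries below reflect.
 */
#include <stdio.h>
#include <sys/mman.h>

int main(void)
{
	size_t len = (size_t)0x400000000ULL;	/* 16 GiB of address space, no backing pages yet */

	/* PROT_NONE + MAP_ANONYMOUS + MAP_NORESERVE reserves addresses only;
	 * real pages (e.g. hugepages) are mapped into the region later. */
	void *va = mmap(NULL, len, PROT_NONE,
			MAP_PRIVATE | MAP_ANONYMOUS | MAP_NORESERVE, -1, 0);
	if (va == MAP_FAILED) {
		perror("mmap");
		return 1;
	}

	printf("virtual area reserved at %p (size = 0x%zx)\n", va, len);

	munmap(va, len);
	return 0;
}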
00:06:13.948 EAL: Setting maximum number of open files to 524288 00:06:13.948 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:13.948 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:13.948 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:13.948 EAL: Ask a virtual area of 0x61000 bytes 00:06:13.948 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:13.948 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:13.948 EAL: Ask a virtual area of 0x400000000 bytes 00:06:13.948 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:13.948 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:13.948 EAL: Ask a virtual area of 0x61000 bytes 00:06:13.948 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:13.948 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:13.948 EAL: Ask a virtual area of 0x400000000 bytes 00:06:13.948 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:13.948 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:13.948 EAL: Ask a virtual area of 0x61000 bytes 00:06:13.948 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:13.948 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:13.948 EAL: Ask a virtual area of 0x400000000 bytes 00:06:13.948 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:13.948 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:13.948 EAL: Ask a virtual area of 0x61000 bytes 00:06:13.948 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:13.948 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:13.948 EAL: Ask a virtual area of 0x400000000 bytes 00:06:13.948 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:13.948 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:13.948 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:13.948 EAL: Ask a virtual area of 0x61000 bytes 00:06:13.949 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:13.949 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:13.949 EAL: Ask a virtual area of 0x400000000 bytes 00:06:13.949 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:13.949 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:13.949 EAL: Ask a virtual area of 0x61000 bytes 00:06:13.949 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:13.949 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:13.949 EAL: Ask a virtual area of 0x400000000 bytes 00:06:13.949 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:13.949 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:13.949 EAL: Ask a virtual area of 0x61000 bytes 00:06:13.949 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:13.949 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:13.949 EAL: Ask a virtual area of 0x400000000 bytes 00:06:13.949 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:13.949 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:13.949 EAL: Ask a virtual area of 0x61000 bytes 00:06:13.949 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:13.949 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:13.949 EAL: Ask a virtual area of 0x400000000 bytes 00:06:13.949 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:06:13.949 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:13.949 EAL: Hugepages will be freed exactly as allocated. 00:06:13.949 EAL: No shared files mode enabled, IPC is disabled 00:06:13.949 EAL: No shared files mode enabled, IPC is disabled 00:06:13.949 EAL: TSC frequency is ~2300000 KHz 00:06:13.949 EAL: Main lcore 0 is ready (tid=7ff80cf58b00;cpuset=[0]) 00:06:13.949 EAL: Trying to obtain current memory policy. 00:06:13.949 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:13.949 EAL: Restoring previous memory policy: 0 00:06:13.949 EAL: request: mp_malloc_sync 00:06:13.949 EAL: No shared files mode enabled, IPC is disabled 00:06:13.949 EAL: Heap on socket 0 was expanded by 2MB 00:06:13.949 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001000000 00:06:13.949 EAL: PCI memory mapped at 0x202001001000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001002000 00:06:13.949 EAL: PCI memory mapped at 0x202001003000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001004000 00:06:13.949 EAL: PCI memory mapped at 0x202001005000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001006000 00:06:13.949 EAL: PCI memory mapped at 0x202001007000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001008000 00:06:13.949 EAL: PCI memory mapped at 0x202001009000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x20200100a000 00:06:13.949 EAL: PCI memory mapped at 0x20200100b000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x20200100c000 00:06:13.949 EAL: PCI memory mapped at 0x20200100d000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x20200100e000 00:06:13.949 EAL: PCI memory mapped at 0x20200100f000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001010000 00:06:13.949 EAL: PCI memory mapped at 0x202001011000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 
EAL: PCI memory mapped at 0x202001012000 00:06:13.949 EAL: PCI memory mapped at 0x202001013000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001014000 00:06:13.949 EAL: PCI memory mapped at 0x202001015000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001016000 00:06:13.949 EAL: PCI memory mapped at 0x202001017000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001018000 00:06:13.949 EAL: PCI memory mapped at 0x202001019000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x20200101a000 00:06:13.949 EAL: PCI memory mapped at 0x20200101b000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x20200101c000 00:06:13.949 EAL: PCI memory mapped at 0x20200101d000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:13.949 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x20200101e000 00:06:13.949 EAL: PCI memory mapped at 0x20200101f000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001020000 00:06:13.949 EAL: PCI memory mapped at 0x202001021000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001022000 00:06:13.949 EAL: PCI memory mapped at 0x202001023000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001024000 00:06:13.949 EAL: PCI memory mapped at 0x202001025000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001026000 00:06:13.949 EAL: PCI memory mapped at 0x202001027000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001028000 00:06:13.949 EAL: PCI memory mapped at 0x202001029000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 
00:06:13.949 EAL: PCI memory mapped at 0x20200102a000 00:06:13.949 EAL: PCI memory mapped at 0x20200102b000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x20200102c000 00:06:13.949 EAL: PCI memory mapped at 0x20200102d000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x20200102e000 00:06:13.949 EAL: PCI memory mapped at 0x20200102f000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001030000 00:06:13.949 EAL: PCI memory mapped at 0x202001031000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001032000 00:06:13.949 EAL: PCI memory mapped at 0x202001033000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001034000 00:06:13.949 EAL: PCI memory mapped at 0x202001035000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:13.949 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:06:13.949 EAL: probe driver: 8086:37c9 qat 00:06:13.949 EAL: PCI memory mapped at 0x202001036000 00:06:13.949 EAL: PCI memory mapped at 0x202001037000 00:06:13.949 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:13.950 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001038000 00:06:13.950 EAL: PCI memory mapped at 0x202001039000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:13.950 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x20200103a000 00:06:13.950 EAL: PCI memory mapped at 0x20200103b000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:13.950 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x20200103c000 00:06:13.950 EAL: PCI memory mapped at 0x20200103d000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:13.950 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x20200103e000 00:06:13.950 EAL: PCI memory mapped at 0x20200103f000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:13.950 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001040000 00:06:13.950 EAL: PCI memory mapped at 0x202001041000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:13.950 EAL: Trying to obtain current memory policy. 
00:06:13.950 EAL: Setting policy MPOL_PREFERRED for socket 1 00:06:13.950 EAL: Restoring previous memory policy: 4 00:06:13.950 EAL: request: mp_malloc_sync 00:06:13.950 EAL: No shared files mode enabled, IPC is disabled 00:06:13.950 EAL: Heap on socket 1 was expanded by 2MB 00:06:13.950 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001042000 00:06:13.950 EAL: PCI memory mapped at 0x202001043000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001044000 00:06:13.950 EAL: PCI memory mapped at 0x202001045000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001046000 00:06:13.950 EAL: PCI memory mapped at 0x202001047000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001048000 00:06:13.950 EAL: PCI memory mapped at 0x202001049000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x20200104a000 00:06:13.950 EAL: PCI memory mapped at 0x20200104b000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x20200104c000 00:06:13.950 EAL: PCI memory mapped at 0x20200104d000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x20200104e000 00:06:13.950 EAL: PCI memory mapped at 0x20200104f000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001050000 00:06:13.950 EAL: PCI memory mapped at 0x202001051000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001052000 00:06:13.950 EAL: PCI memory mapped at 0x202001053000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001054000 00:06:13.950 EAL: PCI memory mapped at 0x202001055000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001056000 00:06:13.950 EAL: PCI memory mapped at 0x202001057000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 
00:06:13.950 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x202001058000 00:06:13.950 EAL: PCI memory mapped at 0x202001059000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x20200105a000 00:06:13.950 EAL: PCI memory mapped at 0x20200105b000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x20200105c000 00:06:13.950 EAL: PCI memory mapped at 0x20200105d000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:13.950 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:06:13.950 EAL: probe driver: 8086:37c9 qat 00:06:13.950 EAL: PCI memory mapped at 0x20200105e000 00:06:13.950 EAL: PCI memory mapped at 0x20200105f000 00:06:13.950 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:13.950 EAL: No shared files mode enabled, IPC is disabled 00:06:13.950 EAL: No shared files mode enabled, IPC is disabled 00:06:13.950 EAL: No PCI address specified using 'addr=' in: bus=pci 00:06:13.950 EAL: Mem event callback 'spdk:(nil)' registered 00:06:13.950 00:06:13.950 00:06:13.950 CUnit - A unit testing framework for C - Version 2.1-3 00:06:13.950 http://cunit.sourceforge.net/ 00:06:13.950 00:06:13.950 00:06:13.950 Suite: components_suite 00:06:13.950 Test: vtophys_malloc_test ...passed 00:06:13.950 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:13.950 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:13.950 EAL: Restoring previous memory policy: 4 00:06:13.950 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.950 EAL: request: mp_malloc_sync 00:06:13.950 EAL: No shared files mode enabled, IPC is disabled 00:06:13.950 EAL: Heap on socket 0 was expanded by 4MB 00:06:13.950 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.950 EAL: request: mp_malloc_sync 00:06:13.950 EAL: No shared files mode enabled, IPC is disabled 00:06:13.950 EAL: Heap on socket 0 was shrunk by 4MB 00:06:13.950 EAL: Trying to obtain current memory policy. 00:06:13.950 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:13.950 EAL: Restoring previous memory policy: 4 00:06:13.950 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.950 EAL: request: mp_malloc_sync 00:06:13.950 EAL: No shared files mode enabled, IPC is disabled 00:06:13.950 EAL: Heap on socket 0 was expanded by 6MB 00:06:13.950 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.950 EAL: request: mp_malloc_sync 00:06:13.950 EAL: No shared files mode enabled, IPC is disabled 00:06:13.950 EAL: Heap on socket 0 was shrunk by 6MB 00:06:13.950 EAL: Trying to obtain current memory policy. 
00:06:13.950 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:13.950 EAL: Restoring previous memory policy: 4 00:06:13.950 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.950 EAL: request: mp_malloc_sync 00:06:13.950 EAL: No shared files mode enabled, IPC is disabled 00:06:13.950 EAL: Heap on socket 0 was expanded by 10MB 00:06:13.951 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.951 EAL: request: mp_malloc_sync 00:06:13.951 EAL: No shared files mode enabled, IPC is disabled 00:06:13.951 EAL: Heap on socket 0 was shrunk by 10MB 00:06:13.951 EAL: Trying to obtain current memory policy. 00:06:13.951 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:13.951 EAL: Restoring previous memory policy: 4 00:06:13.951 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.951 EAL: request: mp_malloc_sync 00:06:13.951 EAL: No shared files mode enabled, IPC is disabled 00:06:13.951 EAL: Heap on socket 0 was expanded by 18MB 00:06:13.951 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.951 EAL: request: mp_malloc_sync 00:06:13.951 EAL: No shared files mode enabled, IPC is disabled 00:06:13.951 EAL: Heap on socket 0 was shrunk by 18MB 00:06:13.951 EAL: Trying to obtain current memory policy. 00:06:13.951 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:13.951 EAL: Restoring previous memory policy: 4 00:06:13.951 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.951 EAL: request: mp_malloc_sync 00:06:13.951 EAL: No shared files mode enabled, IPC is disabled 00:06:13.951 EAL: Heap on socket 0 was expanded by 34MB 00:06:13.951 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.951 EAL: request: mp_malloc_sync 00:06:13.951 EAL: No shared files mode enabled, IPC is disabled 00:06:13.951 EAL: Heap on socket 0 was shrunk by 34MB 00:06:13.951 EAL: Trying to obtain current memory policy. 00:06:13.951 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:13.951 EAL: Restoring previous memory policy: 4 00:06:13.951 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.951 EAL: request: mp_malloc_sync 00:06:13.951 EAL: No shared files mode enabled, IPC is disabled 00:06:13.951 EAL: Heap on socket 0 was expanded by 66MB 00:06:13.951 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.951 EAL: request: mp_malloc_sync 00:06:13.951 EAL: No shared files mode enabled, IPC is disabled 00:06:13.951 EAL: Heap on socket 0 was shrunk by 66MB 00:06:13.951 EAL: Trying to obtain current memory policy. 00:06:13.951 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:13.951 EAL: Restoring previous memory policy: 4 00:06:13.951 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.951 EAL: request: mp_malloc_sync 00:06:13.951 EAL: No shared files mode enabled, IPC is disabled 00:06:13.951 EAL: Heap on socket 0 was expanded by 130MB 00:06:13.951 EAL: Calling mem event callback 'spdk:(nil)' 00:06:13.951 EAL: request: mp_malloc_sync 00:06:13.951 EAL: No shared files mode enabled, IPC is disabled 00:06:13.951 EAL: Heap on socket 0 was shrunk by 130MB 00:06:13.951 EAL: Trying to obtain current memory policy. 
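[editor's note] The alternating "Heap on socket 0 was expanded by ... / shrunk by ..." entries around this point come from the 'spdk:(nil)' mem event callback firing as the test allocates and frees progressively larger buffers. A hedged sketch of that allocation/translation pattern is below; it is an illustration, not the vtophys test source, it assumes the SPDK environment has already been initialized (as it has at this point in the log), and the spdk_* calls are used as declared in spdk/env.h to the best of my recollection.

/*
 * Sketch of the pattern behind the expand/shrink messages: an allocation that
 * cannot be satisfied from the current heap grows it (triggering the mem event
 * callback), and freeing the buffer lets the heap be shrunk again.
 */
#include <stddef.h>
#include "spdk/env.h"

static int alloc_and_translate(size_t size)
{
	/* May expand the DPDK heap on the local socket. */
	void *buf = spdk_dma_malloc(size, 0x1000, NULL);

	if (buf == NULL) {
		return -1;
	}

	/* Same VA->PA lookup the vtophys tests verify. */
	if (spdk_vtophys(buf, NULL) == SPDK_VTOPHYS_ERROR) {
		spdk_dma_free(buf);
		return -1;
	}

	spdk_dma_free(buf);	/* allows the heap to be shrunk again */
	return 0;
}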
00:06:13.951 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:14.247 EAL: Restoring previous memory policy: 4 00:06:14.247 EAL: Calling mem event callback 'spdk:(nil)' 00:06:14.247 EAL: request: mp_malloc_sync 00:06:14.247 EAL: No shared files mode enabled, IPC is disabled 00:06:14.247 EAL: Heap on socket 0 was expanded by 258MB 00:06:14.247 EAL: Calling mem event callback 'spdk:(nil)' 00:06:14.247 EAL: request: mp_malloc_sync 00:06:14.247 EAL: No shared files mode enabled, IPC is disabled 00:06:14.247 EAL: Heap on socket 0 was shrunk by 258MB 00:06:14.247 EAL: Trying to obtain current memory policy. 00:06:14.247 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:14.247 EAL: Restoring previous memory policy: 4 00:06:14.247 EAL: Calling mem event callback 'spdk:(nil)' 00:06:14.247 EAL: request: mp_malloc_sync 00:06:14.247 EAL: No shared files mode enabled, IPC is disabled 00:06:14.247 EAL: Heap on socket 0 was expanded by 514MB 00:06:14.520 EAL: Calling mem event callback 'spdk:(nil)' 00:06:14.520 EAL: request: mp_malloc_sync 00:06:14.520 EAL: No shared files mode enabled, IPC is disabled 00:06:14.520 EAL: Heap on socket 0 was shrunk by 514MB 00:06:14.520 EAL: Trying to obtain current memory policy. 00:06:14.520 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:14.778 EAL: Restoring previous memory policy: 4 00:06:14.778 EAL: Calling mem event callback 'spdk:(nil)' 00:06:14.778 EAL: request: mp_malloc_sync 00:06:14.778 EAL: No shared files mode enabled, IPC is disabled 00:06:14.778 EAL: Heap on socket 0 was expanded by 1026MB 00:06:15.036 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.036 EAL: request: mp_malloc_sync 00:06:15.036 EAL: No shared files mode enabled, IPC is disabled 00:06:15.036 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:15.036 passed 00:06:15.036 00:06:15.036 Run Summary: Type Total Ran Passed Failed Inactive 00:06:15.036 suites 1 1 n/a 0 0 00:06:15.036 tests 2 2 2 0 0 00:06:15.036 asserts 5547 5547 5547 0 n/a 00:06:15.036 00:06:15.036 Elapsed time = 1.177 seconds 00:06:15.037 EAL: No shared files mode enabled, IPC is disabled 00:06:15.037 EAL: No shared files mode enabled, IPC is disabled 00:06:15.037 EAL: No shared files mode enabled, IPC is disabled 00:06:15.037 00:06:15.037 real 0m1.384s 00:06:15.037 user 0m0.764s 00:06:15.037 sys 0m0.582s 00:06:15.037 09:10:23 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:15.037 09:10:23 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:15.037 ************************************ 00:06:15.037 END TEST env_vtophys 00:06:15.037 ************************************ 00:06:15.295 09:10:24 env -- common/autotest_common.sh@1142 -- # return 0 00:06:15.295 09:10:24 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:06:15.295 09:10:24 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:15.295 09:10:24 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.295 09:10:24 env -- common/autotest_common.sh@10 -- # set +x 00:06:15.295 ************************************ 00:06:15.295 START TEST env_pci 00:06:15.295 ************************************ 00:06:15.295 09:10:24 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:06:15.295 00:06:15.295 00:06:15.295 CUnit - A unit testing framework for C - Version 2.1-3 00:06:15.295 http://cunit.sourceforge.net/ 00:06:15.295 00:06:15.295 00:06:15.295 Suite: pci 00:06:15.295 Test: 
pci_hook ...[2024-07-15 09:10:24.089244] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 43103 has claimed it 00:06:15.295 EAL: Cannot find device (10000:00:01.0) 00:06:15.295 EAL: Failed to attach device on primary process 00:06:15.295 passed 00:06:15.295 00:06:15.295 Run Summary: Type Total Ran Passed Failed Inactive 00:06:15.295 suites 1 1 n/a 0 0 00:06:15.295 tests 1 1 1 0 0 00:06:15.295 asserts 25 25 25 0 n/a 00:06:15.295 00:06:15.295 Elapsed time = 0.043 seconds 00:06:15.295 00:06:15.295 real 0m0.069s 00:06:15.295 user 0m0.015s 00:06:15.295 sys 0m0.054s 00:06:15.295 09:10:24 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:15.295 09:10:24 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:15.295 ************************************ 00:06:15.295 END TEST env_pci 00:06:15.295 ************************************ 00:06:15.295 09:10:24 env -- common/autotest_common.sh@1142 -- # return 0 00:06:15.295 09:10:24 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:15.295 09:10:24 env -- env/env.sh@15 -- # uname 00:06:15.295 09:10:24 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:15.295 09:10:24 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:15.295 09:10:24 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:15.295 09:10:24 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:06:15.295 09:10:24 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.295 09:10:24 env -- common/autotest_common.sh@10 -- # set +x 00:06:15.295 ************************************ 00:06:15.295 START TEST env_dpdk_post_init 00:06:15.295 ************************************ 00:06:15.295 09:10:24 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:15.553 EAL: Detected CPU lcores: 72 00:06:15.553 EAL: Detected NUMA nodes: 2 00:06:15.553 EAL: Detected shared linkage of DPDK 00:06:15.553 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:15.553 EAL: Selected IOVA mode 'PA' 00:06:15.553 EAL: VFIO support initialized 00:06:15.553 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:15.553 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:15.553 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.553 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:15.553 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.553 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:15.553 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:15.553 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.553 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:15.553 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.553 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:15.553 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:15.553 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, 
max queue pairs: 0 00:06:15.553 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:15.553 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.553 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:15.553 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:15.553 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: 
Creating cryptodev 0000:3d:02.2_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 
00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters 
- name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:15.554 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.554 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:15.554 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max 
queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:15.555 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:15.555 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:15.555 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:15.555 EAL: Using IOMMU type 1 (Type 1) 00:06:15.555 EAL: Ignore mapping IO port bar(1) 00:06:15.555 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:06:15.555 EAL: Ignore mapping IO port bar(1) 00:06:15.555 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:06:15.555 EAL: Ignore mapping IO port bar(1) 00:06:15.555 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:06:15.555 EAL: Ignore mapping IO port bar(1) 00:06:15.555 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:06:15.555 EAL: Ignore mapping IO port bar(1) 00:06:15.555 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:06:15.555 EAL: Ignore mapping IO port bar(1) 00:06:15.555 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:06:15.813 EAL: Ignore mapping IO port bar(1) 00:06:15.813 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:06:15.813 EAL: Ignore mapping IO port bar(1) 00:06:15.813 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:06:15.813 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:06:15.813 EAL: Ignore mapping IO port bar(1) 00:06:15.813 EAL: Probe PCI driver: 
spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:06:16.071 EAL: Ignore mapping IO port bar(1) 00:06:16.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:06:16.071 EAL: Ignore mapping IO port bar(1) 00:06:16.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:06:16.071 EAL: Ignore mapping IO port bar(1) 00:06:16.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:06:16.071 EAL: Ignore mapping IO port bar(1) 00:06:16.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:06:16.071 EAL: Ignore mapping IO port bar(1) 00:06:16.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:06:16.071 EAL: Ignore mapping IO port bar(1) 00:06:16.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:06:16.071 EAL: Ignore mapping IO port bar(1) 00:06:16.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:06:16.071 EAL: Ignore mapping IO port bar(1) 00:06:16.071 EAL: Ignore mapping IO port bar(5) 00:06:16.071 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:06:16.071 EAL: Ignore mapping IO port bar(1) 00:06:16.071 EAL: Ignore mapping IO port bar(5) 00:06:16.071 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:06:19.349 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:06:19.349 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:06:19.349 Starting DPDK initialization... 00:06:19.349 Starting SPDK post initialization... 00:06:19.349 SPDK NVMe probe 00:06:19.349 Attaching to 0000:5e:00.0 00:06:19.349 Attached to 0000:5e:00.0 00:06:19.349 Cleaning up... 00:06:19.349 00:06:19.349 real 0m3.511s 00:06:19.349 user 0m2.397s 00:06:19.349 sys 0m0.669s 00:06:19.349 09:10:27 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.349 09:10:27 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:19.349 ************************************ 00:06:19.349 END TEST env_dpdk_post_init 00:06:19.349 ************************************ 00:06:19.349 09:10:27 env -- common/autotest_common.sh@1142 -- # return 0 00:06:19.349 09:10:27 env -- env/env.sh@26 -- # uname 00:06:19.349 09:10:27 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:19.349 09:10:27 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:19.349 09:10:27 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:19.349 09:10:27 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.349 09:10:27 env -- common/autotest_common.sh@10 -- # set +x 00:06:19.349 ************************************ 00:06:19.349 START TEST env_mem_callbacks 00:06:19.349 ************************************ 00:06:19.349 09:10:27 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:19.349 EAL: Detected CPU lcores: 72 00:06:19.349 EAL: Detected NUMA nodes: 2 00:06:19.349 EAL: Detected shared linkage of DPDK 00:06:19.349 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:19.349 EAL: Selected IOVA mode 'PA' 00:06:19.349 EAL: VFIO support initialized 00:06:19.349 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:19.349 CRYPTODEV: 
Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.349 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.349 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.349 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.349 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.349 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.349 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.349 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.349 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.349 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:19.349 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue 
pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating 
cryptodev 0000:3f:02.0_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.350 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:19.350 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.350 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:19.351 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.351 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:19.351 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.351 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:19.351 00:06:19.351 00:06:19.351 CUnit - A unit testing framework for C - Version 2.1-3 00:06:19.351 http://cunit.sourceforge.net/ 00:06:19.351 00:06:19.351 00:06:19.351 Suite: memory 00:06:19.351 Test: test ... 
00:06:19.351 register 0x200000200000 2097152 00:06:19.351 register 0x201000a00000 2097152 00:06:19.351 malloc 3145728 00:06:19.351 register 0x200000400000 4194304 00:06:19.351 buf 0x200000500000 len 3145728 PASSED 00:06:19.351 malloc 64 00:06:19.351 buf 0x2000004fff40 len 64 PASSED 00:06:19.351 malloc 4194304 00:06:19.351 register 0x200000800000 6291456 00:06:19.351 buf 0x200000a00000 len 4194304 PASSED 00:06:19.351 free 0x200000500000 3145728 00:06:19.351 free 0x2000004fff40 64 00:06:19.351 unregister 0x200000400000 4194304 PASSED 00:06:19.351 free 0x200000a00000 4194304 00:06:19.351 unregister 0x200000800000 6291456 PASSED 00:06:19.351 malloc 8388608 00:06:19.351 register 0x200000400000 10485760 00:06:19.351 buf 0x200000600000 len 8388608 PASSED 00:06:19.351 free 0x200000600000 8388608 00:06:19.351 unregister 0x200000400000 10485760 PASSED 00:06:19.351 passed 00:06:19.351 00:06:19.351 Run Summary: Type Total Ran Passed Failed Inactive 00:06:19.351 suites 1 1 n/a 0 0 00:06:19.351 tests 1 1 1 0 0 00:06:19.351 asserts 16 16 16 0 n/a 00:06:19.351 00:06:19.351 Elapsed time = 0.006 seconds 00:06:19.351 00:06:19.351 real 0m0.108s 00:06:19.351 user 0m0.035s 00:06:19.351 sys 0m0.072s 00:06:19.351 09:10:27 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.351 09:10:27 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:19.351 ************************************ 00:06:19.351 END TEST env_mem_callbacks 00:06:19.351 ************************************ 00:06:19.351 09:10:27 env -- common/autotest_common.sh@1142 -- # return 0 00:06:19.351 00:06:19.351 real 0m5.805s 00:06:19.351 user 0m3.589s 00:06:19.351 sys 0m1.775s 00:06:19.351 09:10:27 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.351 09:10:27 env -- common/autotest_common.sh@10 -- # set +x 00:06:19.351 ************************************ 00:06:19.351 END TEST env 00:06:19.352 ************************************ 00:06:19.352 09:10:27 -- common/autotest_common.sh@1142 -- # return 0 00:06:19.352 09:10:27 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:19.352 09:10:27 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:19.352 09:10:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.352 09:10:27 -- common/autotest_common.sh@10 -- # set +x 00:06:19.352 ************************************ 00:06:19.352 START TEST rpc 00:06:19.352 ************************************ 00:06:19.352 09:10:28 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:19.352 * Looking for test storage... 
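
The START TEST / END TEST banners and the real/user/sys timing lines above come from the run_test helper that autotest.sh uses to wrap each suite (here it has just invoked run_test rpc .../spdk/test/rpc/rpc.sh). A loose sketch of that wrapper, with the banner and timing behaviour inferred from this log and the helper's real internals (in test/common/autotest_common.sh) otherwise assumed:

    run_test() {
        # needs at least a test name and a command, cf. the traced '[' 2 -le 1 ']' check
        if [ "$#" -le 1 ]; then
            return 1
        fi
        local test_name=$1
        shift
        echo "************************************"
        echo "START TEST $test_name"
        echo "************************************"
        time "$@"    # the real/user/sys lines in the log look like bash's time output
        echo "************************************"
        echo "END TEST $test_name"
        echo "************************************"
    }
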
00:06:19.352 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:19.352 09:10:28 rpc -- rpc/rpc.sh@65 -- # spdk_pid=43755 00:06:19.352 09:10:28 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:19.352 09:10:28 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:19.352 09:10:28 rpc -- rpc/rpc.sh@67 -- # waitforlisten 43755 00:06:19.352 09:10:28 rpc -- common/autotest_common.sh@829 -- # '[' -z 43755 ']' 00:06:19.352 09:10:28 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.352 09:10:28 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:19.352 09:10:28 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.352 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.352 09:10:28 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:19.352 09:10:28 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:19.352 [2024-07-15 09:10:28.211420] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:06:19.352 [2024-07-15 09:10:28.211491] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid43755 ] 00:06:19.610 [2024-07-15 09:10:28.339846] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.610 [2024-07-15 09:10:28.437083] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:19.610 [2024-07-15 09:10:28.437136] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 43755' to capture a snapshot of events at runtime. 00:06:19.610 [2024-07-15 09:10:28.437151] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:19.610 [2024-07-15 09:10:28.437164] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:19.610 [2024-07-15 09:10:28.437175] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid43755 for offline analysis/debug. 
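
The xtrace above (rpc/rpc.sh lines 64-67) shows how the rpc suite brings up its target: spdk_tgt is started with the bdev tracepoint group enabled, the PID is recorded, a cleanup trap is installed, and waitforlisten polls the RPC UNIX socket before any rpc_cmd is issued. A simplified reconstruction of that launch sequence; the waitforlisten and killprocess helpers live in autotest_common.sh and are only referenced here, not reproduced:

    # launch the SPDK target with the bdev tracepoint group, as traced above
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev &
    spdk_pid=$!
    trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
    # block until the target answers on /var/tmp/spdk.sock (the default RPC socket)
    waitforlisten $spdk_pid
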
00:06:19.610 [2024-07-15 09:10:28.437211] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.201 09:10:29 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:20.201 09:10:29 rpc -- common/autotest_common.sh@862 -- # return 0 00:06:20.201 09:10:29 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:20.201 09:10:29 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:20.201 09:10:29 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:20.201 09:10:29 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:20.201 09:10:29 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:20.201 09:10:29 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:20.201 09:10:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:20.461 ************************************ 00:06:20.461 START TEST rpc_integrity 00:06:20.461 ************************************ 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:20.461 { 00:06:20.461 "name": "Malloc0", 00:06:20.461 "aliases": [ 00:06:20.461 "3d9af188-cc10-40d7-a596-de3ba98febbb" 00:06:20.461 ], 00:06:20.461 "product_name": "Malloc disk", 00:06:20.461 "block_size": 512, 00:06:20.461 "num_blocks": 16384, 00:06:20.461 "uuid": "3d9af188-cc10-40d7-a596-de3ba98febbb", 00:06:20.461 "assigned_rate_limits": { 00:06:20.461 "rw_ios_per_sec": 0, 00:06:20.461 "rw_mbytes_per_sec": 0, 00:06:20.461 "r_mbytes_per_sec": 0, 00:06:20.461 "w_mbytes_per_sec": 0 00:06:20.461 }, 00:06:20.461 
"claimed": false, 00:06:20.461 "zoned": false, 00:06:20.461 "supported_io_types": { 00:06:20.461 "read": true, 00:06:20.461 "write": true, 00:06:20.461 "unmap": true, 00:06:20.461 "flush": true, 00:06:20.461 "reset": true, 00:06:20.461 "nvme_admin": false, 00:06:20.461 "nvme_io": false, 00:06:20.461 "nvme_io_md": false, 00:06:20.461 "write_zeroes": true, 00:06:20.461 "zcopy": true, 00:06:20.461 "get_zone_info": false, 00:06:20.461 "zone_management": false, 00:06:20.461 "zone_append": false, 00:06:20.461 "compare": false, 00:06:20.461 "compare_and_write": false, 00:06:20.461 "abort": true, 00:06:20.461 "seek_hole": false, 00:06:20.461 "seek_data": false, 00:06:20.461 "copy": true, 00:06:20.461 "nvme_iov_md": false 00:06:20.461 }, 00:06:20.461 "memory_domains": [ 00:06:20.461 { 00:06:20.461 "dma_device_id": "system", 00:06:20.461 "dma_device_type": 1 00:06:20.461 }, 00:06:20.461 { 00:06:20.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:20.461 "dma_device_type": 2 00:06:20.461 } 00:06:20.461 ], 00:06:20.461 "driver_specific": {} 00:06:20.461 } 00:06:20.461 ]' 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.461 [2024-07-15 09:10:29.293179] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:20.461 [2024-07-15 09:10:29.293222] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:20.461 [2024-07-15 09:10:29.293244] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8deb0 00:06:20.461 [2024-07-15 09:10:29.293257] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:20.461 [2024-07-15 09:10:29.294752] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:20.461 [2024-07-15 09:10:29.294780] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:20.461 Passthru0 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.461 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.461 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:20.461 { 00:06:20.461 "name": "Malloc0", 00:06:20.461 "aliases": [ 00:06:20.461 "3d9af188-cc10-40d7-a596-de3ba98febbb" 00:06:20.461 ], 00:06:20.461 "product_name": "Malloc disk", 00:06:20.461 "block_size": 512, 00:06:20.461 "num_blocks": 16384, 00:06:20.461 "uuid": "3d9af188-cc10-40d7-a596-de3ba98febbb", 00:06:20.461 "assigned_rate_limits": { 00:06:20.461 "rw_ios_per_sec": 0, 00:06:20.461 "rw_mbytes_per_sec": 0, 00:06:20.461 "r_mbytes_per_sec": 0, 00:06:20.461 "w_mbytes_per_sec": 0 00:06:20.461 }, 00:06:20.461 "claimed": true, 00:06:20.461 "claim_type": "exclusive_write", 00:06:20.461 "zoned": false, 00:06:20.461 "supported_io_types": { 00:06:20.461 "read": true, 00:06:20.461 "write": true, 00:06:20.461 "unmap": true, 00:06:20.461 "flush": true, 
00:06:20.461 "reset": true, 00:06:20.461 "nvme_admin": false, 00:06:20.461 "nvme_io": false, 00:06:20.461 "nvme_io_md": false, 00:06:20.461 "write_zeroes": true, 00:06:20.461 "zcopy": true, 00:06:20.461 "get_zone_info": false, 00:06:20.461 "zone_management": false, 00:06:20.461 "zone_append": false, 00:06:20.461 "compare": false, 00:06:20.461 "compare_and_write": false, 00:06:20.461 "abort": true, 00:06:20.461 "seek_hole": false, 00:06:20.461 "seek_data": false, 00:06:20.461 "copy": true, 00:06:20.461 "nvme_iov_md": false 00:06:20.461 }, 00:06:20.461 "memory_domains": [ 00:06:20.461 { 00:06:20.461 "dma_device_id": "system", 00:06:20.461 "dma_device_type": 1 00:06:20.461 }, 00:06:20.461 { 00:06:20.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:20.461 "dma_device_type": 2 00:06:20.461 } 00:06:20.461 ], 00:06:20.461 "driver_specific": {} 00:06:20.461 }, 00:06:20.461 { 00:06:20.461 "name": "Passthru0", 00:06:20.461 "aliases": [ 00:06:20.461 "d0e80765-2708-521f-8134-6c99eec2faf0" 00:06:20.461 ], 00:06:20.461 "product_name": "passthru", 00:06:20.461 "block_size": 512, 00:06:20.461 "num_blocks": 16384, 00:06:20.461 "uuid": "d0e80765-2708-521f-8134-6c99eec2faf0", 00:06:20.461 "assigned_rate_limits": { 00:06:20.461 "rw_ios_per_sec": 0, 00:06:20.461 "rw_mbytes_per_sec": 0, 00:06:20.461 "r_mbytes_per_sec": 0, 00:06:20.461 "w_mbytes_per_sec": 0 00:06:20.461 }, 00:06:20.461 "claimed": false, 00:06:20.461 "zoned": false, 00:06:20.461 "supported_io_types": { 00:06:20.461 "read": true, 00:06:20.461 "write": true, 00:06:20.461 "unmap": true, 00:06:20.461 "flush": true, 00:06:20.461 "reset": true, 00:06:20.461 "nvme_admin": false, 00:06:20.461 "nvme_io": false, 00:06:20.461 "nvme_io_md": false, 00:06:20.461 "write_zeroes": true, 00:06:20.461 "zcopy": true, 00:06:20.461 "get_zone_info": false, 00:06:20.461 "zone_management": false, 00:06:20.461 "zone_append": false, 00:06:20.461 "compare": false, 00:06:20.461 "compare_and_write": false, 00:06:20.461 "abort": true, 00:06:20.461 "seek_hole": false, 00:06:20.461 "seek_data": false, 00:06:20.461 "copy": true, 00:06:20.461 "nvme_iov_md": false 00:06:20.461 }, 00:06:20.461 "memory_domains": [ 00:06:20.461 { 00:06:20.461 "dma_device_id": "system", 00:06:20.461 "dma_device_type": 1 00:06:20.461 }, 00:06:20.461 { 00:06:20.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:20.461 "dma_device_type": 2 00:06:20.461 } 00:06:20.461 ], 00:06:20.462 "driver_specific": { 00:06:20.462 "passthru": { 00:06:20.462 "name": "Passthru0", 00:06:20.462 "base_bdev_name": "Malloc0" 00:06:20.462 } 00:06:20.462 } 00:06:20.462 } 00:06:20.462 ]' 00:06:20.462 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:20.462 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:20.462 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:20.462 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.462 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.462 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.462 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:20.462 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.462 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.462 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.462 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd 
bdev_get_bdevs 00:06:20.462 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.462 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.721 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.721 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:20.721 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:20.721 09:10:29 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:20.721 00:06:20.721 real 0m0.290s 00:06:20.721 user 0m0.185s 00:06:20.721 sys 0m0.045s 00:06:20.721 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:20.721 09:10:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.721 ************************************ 00:06:20.721 END TEST rpc_integrity 00:06:20.721 ************************************ 00:06:20.721 09:10:29 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:20.721 09:10:29 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:20.721 09:10:29 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:20.721 09:10:29 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:20.721 09:10:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:20.721 ************************************ 00:06:20.721 START TEST rpc_plugins 00:06:20.721 ************************************ 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:20.721 { 00:06:20.721 "name": "Malloc1", 00:06:20.721 "aliases": [ 00:06:20.721 "ccf39d23-383b-4c0c-859a-4a24acfe43af" 00:06:20.721 ], 00:06:20.721 "product_name": "Malloc disk", 00:06:20.721 "block_size": 4096, 00:06:20.721 "num_blocks": 256, 00:06:20.721 "uuid": "ccf39d23-383b-4c0c-859a-4a24acfe43af", 00:06:20.721 "assigned_rate_limits": { 00:06:20.721 "rw_ios_per_sec": 0, 00:06:20.721 "rw_mbytes_per_sec": 0, 00:06:20.721 "r_mbytes_per_sec": 0, 00:06:20.721 "w_mbytes_per_sec": 0 00:06:20.721 }, 00:06:20.721 "claimed": false, 00:06:20.721 "zoned": false, 00:06:20.721 "supported_io_types": { 00:06:20.721 "read": true, 00:06:20.721 "write": true, 00:06:20.721 "unmap": true, 00:06:20.721 "flush": true, 00:06:20.721 "reset": true, 00:06:20.721 "nvme_admin": false, 00:06:20.721 "nvme_io": false, 00:06:20.721 "nvme_io_md": false, 00:06:20.721 "write_zeroes": true, 00:06:20.721 "zcopy": true, 00:06:20.721 "get_zone_info": false, 00:06:20.721 "zone_management": false, 00:06:20.721 "zone_append": false, 00:06:20.721 "compare": false, 00:06:20.721 "compare_and_write": false, 00:06:20.721 "abort": true, 00:06:20.721 "seek_hole": false, 00:06:20.721 "seek_data": 
false, 00:06:20.721 "copy": true, 00:06:20.721 "nvme_iov_md": false 00:06:20.721 }, 00:06:20.721 "memory_domains": [ 00:06:20.721 { 00:06:20.721 "dma_device_id": "system", 00:06:20.721 "dma_device_type": 1 00:06:20.721 }, 00:06:20.721 { 00:06:20.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:20.721 "dma_device_type": 2 00:06:20.721 } 00:06:20.721 ], 00:06:20.721 "driver_specific": {} 00:06:20.721 } 00:06:20.721 ]' 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:20.721 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:20.721 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:20.980 09:10:29 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:20.980 00:06:20.980 real 0m0.140s 00:06:20.980 user 0m0.093s 00:06:20.980 sys 0m0.016s 00:06:20.980 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:20.980 09:10:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:20.980 ************************************ 00:06:20.980 END TEST rpc_plugins 00:06:20.980 ************************************ 00:06:20.980 09:10:29 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:20.980 09:10:29 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:20.980 09:10:29 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:20.980 09:10:29 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:20.980 09:10:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:20.980 ************************************ 00:06:20.980 START TEST rpc_trace_cmd_test 00:06:20.980 ************************************ 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:20.980 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid43755", 00:06:20.980 "tpoint_group_mask": "0x8", 00:06:20.980 "iscsi_conn": { 00:06:20.980 "mask": "0x2", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "scsi": { 00:06:20.980 "mask": "0x4", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "bdev": { 00:06:20.980 "mask": "0x8", 00:06:20.980 "tpoint_mask": "0xffffffffffffffff" 00:06:20.980 }, 00:06:20.980 "nvmf_rdma": { 00:06:20.980 
"mask": "0x10", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "nvmf_tcp": { 00:06:20.980 "mask": "0x20", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "ftl": { 00:06:20.980 "mask": "0x40", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "blobfs": { 00:06:20.980 "mask": "0x80", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "dsa": { 00:06:20.980 "mask": "0x200", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "thread": { 00:06:20.980 "mask": "0x400", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "nvme_pcie": { 00:06:20.980 "mask": "0x800", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "iaa": { 00:06:20.980 "mask": "0x1000", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "nvme_tcp": { 00:06:20.980 "mask": "0x2000", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "bdev_nvme": { 00:06:20.980 "mask": "0x4000", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 }, 00:06:20.980 "sock": { 00:06:20.980 "mask": "0x8000", 00:06:20.980 "tpoint_mask": "0x0" 00:06:20.980 } 00:06:20.980 }' 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:20.980 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:21.239 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:21.239 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:21.239 09:10:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:21.239 00:06:21.239 real 0m0.217s 00:06:21.239 user 0m0.178s 00:06:21.239 sys 0m0.033s 00:06:21.239 09:10:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:21.239 09:10:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:21.239 ************************************ 00:06:21.239 END TEST rpc_trace_cmd_test 00:06:21.239 ************************************ 00:06:21.239 09:10:30 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:21.239 09:10:30 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:21.239 09:10:30 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:21.239 09:10:30 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:21.239 09:10:30 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:21.239 09:10:30 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.239 09:10:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:21.239 ************************************ 00:06:21.239 START TEST rpc_daemon_integrity 00:06:21.239 ************************************ 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.239 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:21.239 { 00:06:21.239 "name": "Malloc2", 00:06:21.239 "aliases": [ 00:06:21.239 "9b1722da-fe85-4c83-aea4-b28188b122aa" 00:06:21.239 ], 00:06:21.239 "product_name": "Malloc disk", 00:06:21.239 "block_size": 512, 00:06:21.239 "num_blocks": 16384, 00:06:21.239 "uuid": "9b1722da-fe85-4c83-aea4-b28188b122aa", 00:06:21.239 "assigned_rate_limits": { 00:06:21.239 "rw_ios_per_sec": 0, 00:06:21.239 "rw_mbytes_per_sec": 0, 00:06:21.240 "r_mbytes_per_sec": 0, 00:06:21.240 "w_mbytes_per_sec": 0 00:06:21.240 }, 00:06:21.240 "claimed": false, 00:06:21.240 "zoned": false, 00:06:21.240 "supported_io_types": { 00:06:21.240 "read": true, 00:06:21.240 "write": true, 00:06:21.240 "unmap": true, 00:06:21.240 "flush": true, 00:06:21.240 "reset": true, 00:06:21.240 "nvme_admin": false, 00:06:21.240 "nvme_io": false, 00:06:21.240 "nvme_io_md": false, 00:06:21.240 "write_zeroes": true, 00:06:21.240 "zcopy": true, 00:06:21.240 "get_zone_info": false, 00:06:21.240 "zone_management": false, 00:06:21.240 "zone_append": false, 00:06:21.240 "compare": false, 00:06:21.240 "compare_and_write": false, 00:06:21.240 "abort": true, 00:06:21.240 "seek_hole": false, 00:06:21.240 "seek_data": false, 00:06:21.240 "copy": true, 00:06:21.240 "nvme_iov_md": false 00:06:21.240 }, 00:06:21.240 "memory_domains": [ 00:06:21.240 { 00:06:21.240 "dma_device_id": "system", 00:06:21.240 "dma_device_type": 1 00:06:21.240 }, 00:06:21.240 { 00:06:21.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.240 "dma_device_type": 2 00:06:21.240 } 00:06:21.240 ], 00:06:21.240 "driver_specific": {} 00:06:21.240 } 00:06:21.240 ]' 00:06:21.240 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.499 [2024-07-15 09:10:30.199793] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:21.499 [2024-07-15 09:10:30.199842] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:21.499 
[2024-07-15 09:10:30.199869] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8eb20 00:06:21.499 [2024-07-15 09:10:30.199882] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:21.499 [2024-07-15 09:10:30.201327] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:21.499 [2024-07-15 09:10:30.201357] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:21.499 Passthru0 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:21.499 { 00:06:21.499 "name": "Malloc2", 00:06:21.499 "aliases": [ 00:06:21.499 "9b1722da-fe85-4c83-aea4-b28188b122aa" 00:06:21.499 ], 00:06:21.499 "product_name": "Malloc disk", 00:06:21.499 "block_size": 512, 00:06:21.499 "num_blocks": 16384, 00:06:21.499 "uuid": "9b1722da-fe85-4c83-aea4-b28188b122aa", 00:06:21.499 "assigned_rate_limits": { 00:06:21.499 "rw_ios_per_sec": 0, 00:06:21.499 "rw_mbytes_per_sec": 0, 00:06:21.499 "r_mbytes_per_sec": 0, 00:06:21.499 "w_mbytes_per_sec": 0 00:06:21.499 }, 00:06:21.499 "claimed": true, 00:06:21.499 "claim_type": "exclusive_write", 00:06:21.499 "zoned": false, 00:06:21.499 "supported_io_types": { 00:06:21.499 "read": true, 00:06:21.499 "write": true, 00:06:21.499 "unmap": true, 00:06:21.499 "flush": true, 00:06:21.499 "reset": true, 00:06:21.499 "nvme_admin": false, 00:06:21.499 "nvme_io": false, 00:06:21.499 "nvme_io_md": false, 00:06:21.499 "write_zeroes": true, 00:06:21.499 "zcopy": true, 00:06:21.499 "get_zone_info": false, 00:06:21.499 "zone_management": false, 00:06:21.499 "zone_append": false, 00:06:21.499 "compare": false, 00:06:21.499 "compare_and_write": false, 00:06:21.499 "abort": true, 00:06:21.499 "seek_hole": false, 00:06:21.499 "seek_data": false, 00:06:21.499 "copy": true, 00:06:21.499 "nvme_iov_md": false 00:06:21.499 }, 00:06:21.499 "memory_domains": [ 00:06:21.499 { 00:06:21.499 "dma_device_id": "system", 00:06:21.499 "dma_device_type": 1 00:06:21.499 }, 00:06:21.499 { 00:06:21.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.499 "dma_device_type": 2 00:06:21.499 } 00:06:21.499 ], 00:06:21.499 "driver_specific": {} 00:06:21.499 }, 00:06:21.499 { 00:06:21.499 "name": "Passthru0", 00:06:21.499 "aliases": [ 00:06:21.499 "b24abb65-bbf0-5943-aa36-49c0ef19afa7" 00:06:21.499 ], 00:06:21.499 "product_name": "passthru", 00:06:21.499 "block_size": 512, 00:06:21.499 "num_blocks": 16384, 00:06:21.499 "uuid": "b24abb65-bbf0-5943-aa36-49c0ef19afa7", 00:06:21.499 "assigned_rate_limits": { 00:06:21.499 "rw_ios_per_sec": 0, 00:06:21.499 "rw_mbytes_per_sec": 0, 00:06:21.499 "r_mbytes_per_sec": 0, 00:06:21.499 "w_mbytes_per_sec": 0 00:06:21.499 }, 00:06:21.499 "claimed": false, 00:06:21.499 "zoned": false, 00:06:21.499 "supported_io_types": { 00:06:21.499 "read": true, 00:06:21.499 "write": true, 00:06:21.499 "unmap": true, 00:06:21.499 "flush": true, 00:06:21.499 "reset": true, 00:06:21.499 "nvme_admin": false, 00:06:21.499 "nvme_io": false, 00:06:21.499 "nvme_io_md": false, 00:06:21.499 
"write_zeroes": true, 00:06:21.499 "zcopy": true, 00:06:21.499 "get_zone_info": false, 00:06:21.499 "zone_management": false, 00:06:21.499 "zone_append": false, 00:06:21.499 "compare": false, 00:06:21.499 "compare_and_write": false, 00:06:21.499 "abort": true, 00:06:21.499 "seek_hole": false, 00:06:21.499 "seek_data": false, 00:06:21.499 "copy": true, 00:06:21.499 "nvme_iov_md": false 00:06:21.499 }, 00:06:21.499 "memory_domains": [ 00:06:21.499 { 00:06:21.499 "dma_device_id": "system", 00:06:21.499 "dma_device_type": 1 00:06:21.499 }, 00:06:21.499 { 00:06:21.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.499 "dma_device_type": 2 00:06:21.499 } 00:06:21.499 ], 00:06:21.499 "driver_specific": { 00:06:21.499 "passthru": { 00:06:21.499 "name": "Passthru0", 00:06:21.499 "base_bdev_name": "Malloc2" 00:06:21.499 } 00:06:21.499 } 00:06:21.499 } 00:06:21.499 ]' 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:21.499 00:06:21.499 real 0m0.304s 00:06:21.499 user 0m0.178s 00:06:21.499 sys 0m0.057s 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:21.499 09:10:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.499 ************************************ 00:06:21.499 END TEST rpc_daemon_integrity 00:06:21.499 ************************************ 00:06:21.499 09:10:30 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:21.499 09:10:30 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:21.499 09:10:30 rpc -- rpc/rpc.sh@84 -- # killprocess 43755 00:06:21.499 09:10:30 rpc -- common/autotest_common.sh@948 -- # '[' -z 43755 ']' 00:06:21.499 09:10:30 rpc -- common/autotest_common.sh@952 -- # kill -0 43755 00:06:21.499 09:10:30 rpc -- common/autotest_common.sh@953 -- # uname 00:06:21.499 09:10:30 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:21.499 09:10:30 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 43755 00:06:21.499 09:10:30 rpc -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:21.499 09:10:30 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:21.499 09:10:30 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 43755' 00:06:21.499 killing process with pid 43755 00:06:21.499 09:10:30 rpc -- common/autotest_common.sh@967 -- # kill 43755 00:06:21.500 09:10:30 rpc -- common/autotest_common.sh@972 -- # wait 43755 00:06:22.067 00:06:22.067 real 0m2.801s 00:06:22.067 user 0m3.527s 00:06:22.067 sys 0m0.900s 00:06:22.067 09:10:30 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:22.067 09:10:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.067 ************************************ 00:06:22.067 END TEST rpc 00:06:22.067 ************************************ 00:06:22.067 09:10:30 -- common/autotest_common.sh@1142 -- # return 0 00:06:22.067 09:10:30 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:22.067 09:10:30 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:22.067 09:10:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.067 09:10:30 -- common/autotest_common.sh@10 -- # set +x 00:06:22.067 ************************************ 00:06:22.067 START TEST skip_rpc 00:06:22.067 ************************************ 00:06:22.067 09:10:30 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:22.067 * Looking for test storage... 00:06:22.067 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:22.325 09:10:31 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:22.325 09:10:31 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:22.325 09:10:31 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:22.325 09:10:31 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:22.325 09:10:31 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.325 09:10:31 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.325 ************************************ 00:06:22.325 START TEST skip_rpc 00:06:22.325 ************************************ 00:06:22.325 09:10:31 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:06:22.325 09:10:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:22.325 09:10:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=44285 00:06:22.325 09:10:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:22.325 09:10:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:22.325 [2024-07-15 09:10:31.122684] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
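The skip_rpc case that begins here boots spdk_tgt with --no-rpc-server and then asserts that a JSON-RPC call cannot be served. A minimal stand-alone sketch of that flow, using the workspace paths visible in the surrounding trace (the real logic lives in test/rpc/skip_rpc.sh and is more thorough):

    # Sketch only: start a target without an RPC server, expect RPC to fail.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK/build/bin/spdk_tgt" --no-rpc-server -m 0x1 &
    spdk_pid=$!
    sleep 5                                      # stand-in for the test's fixed delay
    if "$SPDK/scripts/rpc.py" spdk_get_version; then
        echo "unexpected: RPC answered although --no-rpc-server was given" >&2
        kill "$spdk_pid"; exit 1
    fi
    kill "$spdk_pid"
    wait "$spdk_pid" 2>/dev/null || true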
00:06:22.325 [2024-07-15 09:10:31.122746] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid44285 ] 00:06:22.326 [2024-07-15 09:10:31.252141] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.584 [2024-07-15 09:10:31.352994] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 44285 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 44285 ']' 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 44285 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 44285 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 44285' 00:06:27.856 killing process with pid 44285 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 44285 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 44285 00:06:27.856 00:06:27.856 real 0m5.443s 00:06:27.856 user 0m5.112s 00:06:27.856 sys 0m0.353s 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:27.856 09:10:36 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.856 ************************************ 00:06:27.856 END TEST skip_rpc 00:06:27.856 ************************************ 
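The killprocess calls above, and throughout the rest of this log, follow a common autotest_common.sh pattern: confirm the pid is set and still alive, check that it is an SPDK reactor, then kill and reap it. A simplified approximation (not the actual helper, which also handles sudo-wrapped processes and non-Linux hosts):

    # Approximation of the killprocess helper seen in this trace.
    killprocess() {
        local pid=$1
        [ -n "$pid" ] || return 1                    # the '[' -z "$pid" ']' guard in the log
        kill -0 "$pid" 2>/dev/null || return 0       # nothing left to kill
        ps --no-headers -o comm= "$pid"              # the log checks for reactor_0 here
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null || true
    }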
00:06:27.856 09:10:36 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:27.856 09:10:36 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:27.856 09:10:36 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:27.856 09:10:36 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.857 09:10:36 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.857 ************************************ 00:06:27.857 START TEST skip_rpc_with_json 00:06:27.857 ************************************ 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=45021 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 45021 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 45021 ']' 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:27.857 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:27.857 09:10:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:27.857 [2024-07-15 09:10:36.643104] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
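The skip_rpc_with_json case that starts here builds a configuration over RPC, saves it to config.json, then replays it without an RPC server and checks the resulting log for the transport banner. Approximately, using only commands that appear later in this trace:

    # Sketch of the save_config / --json round trip exercised below.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC="$SPDK/scripts/rpc.py"
    CONFIG="$SPDK/test/rpc/config.json"
    LOG="$SPDK/test/rpc/log.txt"

    "$SPDK/build/bin/spdk_tgt" -m 0x1 &              # first target: build the config
    tgt_pid=$!
    sleep 5                                          # stand-in for waitforlisten
    $RPC nvmf_get_transports --trtype tcp || true    # expected to fail: no transport yet
    $RPC nvmf_create_transport -t tcp
    $RPC save_config > "$CONFIG"
    kill "$tgt_pid"; wait "$tgt_pid" 2>/dev/null || true

    # second target: replay the saved config with no RPC server, then verify its log
    "$SPDK/build/bin/spdk_tgt" --no-rpc-server -m 0x1 --json "$CONFIG" > "$LOG" 2>&1 &
    tgt_pid=$!
    sleep 5
    kill "$tgt_pid"; wait "$tgt_pid" 2>/dev/null || true
    grep -q 'TCP Transport Init' "$LOG"
    rm "$LOG"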
00:06:27.857 [2024-07-15 09:10:36.643158] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid45021 ] 00:06:27.857 [2024-07-15 09:10:36.758708] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.116 [2024-07-15 09:10:36.858882] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.683 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:28.683 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:06:28.683 09:10:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:28.683 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.683 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:28.683 [2024-07-15 09:10:37.583624] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:28.683 request: 00:06:28.683 { 00:06:28.683 "trtype": "tcp", 00:06:28.683 "method": "nvmf_get_transports", 00:06:28.683 "req_id": 1 00:06:28.683 } 00:06:28.683 Got JSON-RPC error response 00:06:28.683 response: 00:06:28.683 { 00:06:28.683 "code": -19, 00:06:28.683 "message": "No such device" 00:06:28.683 } 00:06:28.684 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:28.684 09:10:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:28.684 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.684 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:28.684 [2024-07-15 09:10:37.591755] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:28.684 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:28.684 09:10:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:28.684 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.684 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:28.943 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:28.943 09:10:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:28.943 { 00:06:28.943 "subsystems": [ 00:06:28.943 { 00:06:28.943 "subsystem": "keyring", 00:06:28.943 "config": [] 00:06:28.943 }, 00:06:28.943 { 00:06:28.943 "subsystem": "iobuf", 00:06:28.943 "config": [ 00:06:28.943 { 00:06:28.943 "method": "iobuf_set_options", 00:06:28.943 "params": { 00:06:28.943 "small_pool_count": 8192, 00:06:28.943 "large_pool_count": 1024, 00:06:28.943 "small_bufsize": 8192, 00:06:28.943 "large_bufsize": 135168 00:06:28.943 } 00:06:28.943 } 00:06:28.943 ] 00:06:28.943 }, 00:06:28.943 { 00:06:28.944 "subsystem": "sock", 00:06:28.944 "config": [ 00:06:28.944 { 00:06:28.944 "method": "sock_set_default_impl", 00:06:28.944 "params": { 00:06:28.944 "impl_name": "posix" 00:06:28.944 } 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "method": "sock_impl_set_options", 00:06:28.944 "params": { 00:06:28.944 "impl_name": "ssl", 00:06:28.944 "recv_buf_size": 4096, 00:06:28.944 "send_buf_size": 4096, 
00:06:28.944 "enable_recv_pipe": true, 00:06:28.944 "enable_quickack": false, 00:06:28.944 "enable_placement_id": 0, 00:06:28.944 "enable_zerocopy_send_server": true, 00:06:28.944 "enable_zerocopy_send_client": false, 00:06:28.944 "zerocopy_threshold": 0, 00:06:28.944 "tls_version": 0, 00:06:28.944 "enable_ktls": false 00:06:28.944 } 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "method": "sock_impl_set_options", 00:06:28.944 "params": { 00:06:28.944 "impl_name": "posix", 00:06:28.944 "recv_buf_size": 2097152, 00:06:28.944 "send_buf_size": 2097152, 00:06:28.944 "enable_recv_pipe": true, 00:06:28.944 "enable_quickack": false, 00:06:28.944 "enable_placement_id": 0, 00:06:28.944 "enable_zerocopy_send_server": true, 00:06:28.944 "enable_zerocopy_send_client": false, 00:06:28.944 "zerocopy_threshold": 0, 00:06:28.944 "tls_version": 0, 00:06:28.944 "enable_ktls": false 00:06:28.944 } 00:06:28.944 } 00:06:28.944 ] 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "vmd", 00:06:28.944 "config": [] 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "accel", 00:06:28.944 "config": [ 00:06:28.944 { 00:06:28.944 "method": "accel_set_options", 00:06:28.944 "params": { 00:06:28.944 "small_cache_size": 128, 00:06:28.944 "large_cache_size": 16, 00:06:28.944 "task_count": 2048, 00:06:28.944 "sequence_count": 2048, 00:06:28.944 "buf_count": 2048 00:06:28.944 } 00:06:28.944 } 00:06:28.944 ] 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "bdev", 00:06:28.944 "config": [ 00:06:28.944 { 00:06:28.944 "method": "bdev_set_options", 00:06:28.944 "params": { 00:06:28.944 "bdev_io_pool_size": 65535, 00:06:28.944 "bdev_io_cache_size": 256, 00:06:28.944 "bdev_auto_examine": true, 00:06:28.944 "iobuf_small_cache_size": 128, 00:06:28.944 "iobuf_large_cache_size": 16 00:06:28.944 } 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "method": "bdev_raid_set_options", 00:06:28.944 "params": { 00:06:28.944 "process_window_size_kb": 1024 00:06:28.944 } 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "method": "bdev_iscsi_set_options", 00:06:28.944 "params": { 00:06:28.944 "timeout_sec": 30 00:06:28.944 } 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "method": "bdev_nvme_set_options", 00:06:28.944 "params": { 00:06:28.944 "action_on_timeout": "none", 00:06:28.944 "timeout_us": 0, 00:06:28.944 "timeout_admin_us": 0, 00:06:28.944 "keep_alive_timeout_ms": 10000, 00:06:28.944 "arbitration_burst": 0, 00:06:28.944 "low_priority_weight": 0, 00:06:28.944 "medium_priority_weight": 0, 00:06:28.944 "high_priority_weight": 0, 00:06:28.944 "nvme_adminq_poll_period_us": 10000, 00:06:28.944 "nvme_ioq_poll_period_us": 0, 00:06:28.944 "io_queue_requests": 0, 00:06:28.944 "delay_cmd_submit": true, 00:06:28.944 "transport_retry_count": 4, 00:06:28.944 "bdev_retry_count": 3, 00:06:28.944 "transport_ack_timeout": 0, 00:06:28.944 "ctrlr_loss_timeout_sec": 0, 00:06:28.944 "reconnect_delay_sec": 0, 00:06:28.944 "fast_io_fail_timeout_sec": 0, 00:06:28.944 "disable_auto_failback": false, 00:06:28.944 "generate_uuids": false, 00:06:28.944 "transport_tos": 0, 00:06:28.944 "nvme_error_stat": false, 00:06:28.944 "rdma_srq_size": 0, 00:06:28.944 "io_path_stat": false, 00:06:28.944 "allow_accel_sequence": false, 00:06:28.944 "rdma_max_cq_size": 0, 00:06:28.944 "rdma_cm_event_timeout_ms": 0, 00:06:28.944 "dhchap_digests": [ 00:06:28.944 "sha256", 00:06:28.944 "sha384", 00:06:28.944 "sha512" 00:06:28.944 ], 00:06:28.944 "dhchap_dhgroups": [ 00:06:28.944 "null", 00:06:28.944 "ffdhe2048", 00:06:28.944 "ffdhe3072", 00:06:28.944 "ffdhe4096", 00:06:28.944 
"ffdhe6144", 00:06:28.944 "ffdhe8192" 00:06:28.944 ] 00:06:28.944 } 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "method": "bdev_nvme_set_hotplug", 00:06:28.944 "params": { 00:06:28.944 "period_us": 100000, 00:06:28.944 "enable": false 00:06:28.944 } 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "method": "bdev_wait_for_examine" 00:06:28.944 } 00:06:28.944 ] 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "scsi", 00:06:28.944 "config": null 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "scheduler", 00:06:28.944 "config": [ 00:06:28.944 { 00:06:28.944 "method": "framework_set_scheduler", 00:06:28.944 "params": { 00:06:28.944 "name": "static" 00:06:28.944 } 00:06:28.944 } 00:06:28.944 ] 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "vhost_scsi", 00:06:28.944 "config": [] 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "vhost_blk", 00:06:28.944 "config": [] 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "ublk", 00:06:28.944 "config": [] 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "nbd", 00:06:28.944 "config": [] 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "subsystem": "nvmf", 00:06:28.944 "config": [ 00:06:28.944 { 00:06:28.944 "method": "nvmf_set_config", 00:06:28.944 "params": { 00:06:28.944 "discovery_filter": "match_any", 00:06:28.944 "admin_cmd_passthru": { 00:06:28.944 "identify_ctrlr": false 00:06:28.944 } 00:06:28.944 } 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "method": "nvmf_set_max_subsystems", 00:06:28.944 "params": { 00:06:28.944 "max_subsystems": 1024 00:06:28.944 } 00:06:28.944 }, 00:06:28.944 { 00:06:28.944 "method": "nvmf_set_crdt", 00:06:28.944 "params": { 00:06:28.945 "crdt1": 0, 00:06:28.945 "crdt2": 0, 00:06:28.945 "crdt3": 0 00:06:28.945 } 00:06:28.945 }, 00:06:28.945 { 00:06:28.945 "method": "nvmf_create_transport", 00:06:28.945 "params": { 00:06:28.945 "trtype": "TCP", 00:06:28.945 "max_queue_depth": 128, 00:06:28.945 "max_io_qpairs_per_ctrlr": 127, 00:06:28.945 "in_capsule_data_size": 4096, 00:06:28.945 "max_io_size": 131072, 00:06:28.945 "io_unit_size": 131072, 00:06:28.945 "max_aq_depth": 128, 00:06:28.945 "num_shared_buffers": 511, 00:06:28.945 "buf_cache_size": 4294967295, 00:06:28.945 "dif_insert_or_strip": false, 00:06:28.945 "zcopy": false, 00:06:28.945 "c2h_success": true, 00:06:28.945 "sock_priority": 0, 00:06:28.945 "abort_timeout_sec": 1, 00:06:28.945 "ack_timeout": 0, 00:06:28.945 "data_wr_pool_size": 0 00:06:28.945 } 00:06:28.945 } 00:06:28.945 ] 00:06:28.945 }, 00:06:28.945 { 00:06:28.945 "subsystem": "iscsi", 00:06:28.945 "config": [ 00:06:28.945 { 00:06:28.945 "method": "iscsi_set_options", 00:06:28.945 "params": { 00:06:28.945 "node_base": "iqn.2016-06.io.spdk", 00:06:28.945 "max_sessions": 128, 00:06:28.945 "max_connections_per_session": 2, 00:06:28.945 "max_queue_depth": 64, 00:06:28.945 "default_time2wait": 2, 00:06:28.945 "default_time2retain": 20, 00:06:28.945 "first_burst_length": 8192, 00:06:28.945 "immediate_data": true, 00:06:28.945 "allow_duplicated_isid": false, 00:06:28.945 "error_recovery_level": 0, 00:06:28.945 "nop_timeout": 60, 00:06:28.945 "nop_in_interval": 30, 00:06:28.945 "disable_chap": false, 00:06:28.945 "require_chap": false, 00:06:28.945 "mutual_chap": false, 00:06:28.945 "chap_group": 0, 00:06:28.945 "max_large_datain_per_connection": 64, 00:06:28.945 "max_r2t_per_connection": 4, 00:06:28.945 "pdu_pool_size": 36864, 00:06:28.945 "immediate_data_pool_size": 16384, 00:06:28.945 "data_out_pool_size": 2048 00:06:28.945 } 00:06:28.945 } 00:06:28.945 ] 00:06:28.945 } 
00:06:28.945 ] 00:06:28.945 } 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 45021 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 45021 ']' 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 45021 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 45021 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 45021' 00:06:28.945 killing process with pid 45021 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 45021 00:06:28.945 09:10:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 45021 00:06:29.514 09:10:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=45203 00:06:29.514 09:10:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:29.514 09:10:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:34.778 09:10:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 45203 00:06:34.778 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 45203 ']' 00:06:34.778 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 45203 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 45203 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 45203' 00:06:34.779 killing process with pid 45203 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 45203 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 45203 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:34.779 00:06:34.779 real 0m7.036s 00:06:34.779 user 0m6.711s 00:06:34.779 sys 0m0.837s 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.779 09:10:43 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:34.779 ************************************ 00:06:34.779 END TEST skip_rpc_with_json 00:06:34.779 ************************************ 00:06:34.779 09:10:43 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:34.779 09:10:43 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:34.779 09:10:43 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:34.779 09:10:43 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.779 09:10:43 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:34.779 ************************************ 00:06:34.779 START TEST skip_rpc_with_delay 00:06:34.779 ************************************ 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:34.779 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:35.037 [2024-07-15 09:10:43.769565] app.c: 831:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
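skip_rpc_with_delay reduces to a single negative check: spdk_tgt must refuse --wait-for-rpc when no RPC server will be started, which is exactly the *ERROR* line printed here. A sketch of that assertion:

    # Sketch: this flag combination is expected to fail at startup.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    if "$SPDK/build/bin/spdk_tgt" --no-rpc-server -m 0x1 --wait-for-rpc; then
        echo "unexpected: target accepted --wait-for-rpc without an RPC server" >&2
        exit 1
    fi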
00:06:35.037 [2024-07-15 09:10:43.769656] app.c: 710:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:35.037 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:35.037 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:35.037 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:35.037 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:35.037 00:06:35.037 real 0m0.087s 00:06:35.037 user 0m0.047s 00:06:35.037 sys 0m0.038s 00:06:35.037 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:35.037 09:10:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:35.037 ************************************ 00:06:35.037 END TEST skip_rpc_with_delay 00:06:35.037 ************************************ 00:06:35.037 09:10:43 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:35.037 09:10:43 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:35.037 09:10:43 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:35.037 09:10:43 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:35.037 09:10:43 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:35.037 09:10:43 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.037 09:10:43 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.037 ************************************ 00:06:35.037 START TEST exit_on_failed_rpc_init 00:06:35.037 ************************************ 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=45965 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 45965 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 45965 ']' 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:35.037 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:35.037 09:10:43 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:35.037 [2024-07-15 09:10:43.936102] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
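The exit_on_failed_rpc_init case that begins here starts one target on the default /var/tmp/spdk.sock and then expects a second target, pinned to a different core mask, to fail because that socket is already in use. In outline (the real test additionally checks the exact error-code handling shown in the es= lines below):

    # Sketch: a second target on an in-use RPC socket must fail to initialize.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK/build/bin/spdk_tgt" -m 0x1 &              # first target owns /var/tmp/spdk.sock
    first_pid=$!
    sleep 5                                          # stand-in for waitforlisten
    if "$SPDK/build/bin/spdk_tgt" -m 0x2; then       # same default socket, expected to fail
        echo "unexpected: second target initialized on an in-use RPC socket" >&2
        kill "$first_pid"; exit 1
    fi
    kill "$first_pid"
    wait "$first_pid" 2>/dev/null || true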
00:06:35.037 [2024-07-15 09:10:43.936168] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid45965 ] 00:06:35.332 [2024-07-15 09:10:44.065358] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.332 [2024-07-15 09:10:44.171667] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:35.917 09:10:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:36.176 [2024-07-15 09:10:44.903950] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:06:36.176 [2024-07-15 09:10:44.904019] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid46140 ] 00:06:36.176 [2024-07-15 09:10:45.022981] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.176 [2024-07-15 09:10:45.119248] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:36.176 [2024-07-15 09:10:45.119331] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:36.176 [2024-07-15 09:10:45.119348] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:36.176 [2024-07-15 09:10:45.119360] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:36.435 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 45965 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 45965 ']' 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 45965 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 45965 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 45965' 00:06:36.436 killing process with pid 45965 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 45965 00:06:36.436 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 45965 00:06:37.003 00:06:37.003 real 0m1.792s 00:06:37.003 user 0m2.068s 00:06:37.003 sys 0m0.581s 00:06:37.003 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:37.003 09:10:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:37.003 ************************************ 00:06:37.003 END TEST exit_on_failed_rpc_init 00:06:37.003 ************************************ 00:06:37.003 09:10:45 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:37.003 09:10:45 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:37.003 00:06:37.003 real 0m14.778s 00:06:37.003 user 0m14.074s 00:06:37.003 sys 0m2.126s 00:06:37.003 09:10:45 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:37.003 09:10:45 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:37.003 ************************************ 00:06:37.003 END TEST skip_rpc 00:06:37.003 ************************************ 00:06:37.003 09:10:45 -- common/autotest_common.sh@1142 -- # return 0 00:06:37.003 09:10:45 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:37.003 09:10:45 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:37.003 09:10:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.003 09:10:45 -- common/autotest_common.sh@10 -- # set +x 00:06:37.003 ************************************ 00:06:37.003 START TEST rpc_client 00:06:37.003 ************************************ 00:06:37.003 09:10:45 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:37.003 * Looking for test storage... 00:06:37.003 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:37.003 09:10:45 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:37.003 OK 00:06:37.003 09:10:45 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:37.003 00:06:37.003 real 0m0.128s 00:06:37.003 user 0m0.056s 00:06:37.003 sys 0m0.081s 00:06:37.004 09:10:45 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:37.004 09:10:45 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:37.004 ************************************ 00:06:37.004 END TEST rpc_client 00:06:37.004 ************************************ 00:06:37.004 09:10:45 -- common/autotest_common.sh@1142 -- # return 0 00:06:37.004 09:10:45 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:37.004 09:10:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:37.004 09:10:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.004 09:10:45 -- common/autotest_common.sh@10 -- # set +x 00:06:37.262 ************************************ 00:06:37.262 START TEST json_config 00:06:37.262 ************************************ 00:06:37.262 09:10:45 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:37.262 09:10:46 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:37.262 09:10:46 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:37.262 09:10:46 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:37.262 09:10:46 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:37.262 09:10:46 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:37.262 09:10:46 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:37.262 09:10:46 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:37.262 09:10:46 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:37.263 09:10:46 json_config -- 
nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:37.263 09:10:46 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:37.263 09:10:46 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:37.263 09:10:46 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:37.263 09:10:46 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.263 09:10:46 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.263 09:10:46 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.263 09:10:46 json_config -- paths/export.sh@5 -- # export PATH 00:06:37.263 09:10:46 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@47 -- # : 0 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:37.263 09:10:46 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:37.263 
09:10:46 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:37.263 INFO: JSON configuration test init 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.263 09:10:46 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:37.263 09:10:46 json_config -- json_config/common.sh@9 -- # local app=target 00:06:37.263 09:10:46 json_config -- json_config/common.sh@10 -- # shift 00:06:37.263 09:10:46 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:37.263 09:10:46 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:37.263 09:10:46 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:37.263 09:10:46 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:37.263 09:10:46 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:37.263 09:10:46 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=46426 00:06:37.263 09:10:46 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:37.263 Waiting for target to run... 
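The json_config target being launched here runs on its own RPC socket (/var/tmp/spdk_tgt.sock) with --wait-for-rpc, has the crypto operations assigned to the DPDK cryptodev accel module, and then loads the NVMe bdev configuration generated by gen_nvme.sh; all of this is visible in the RPC calls that follow. Condensed into a sketch:

    # Sketch of the json_config target bring-up traced below.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"

    "$SPDK/build/bin/spdk_tgt" -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc &
    tgt_pid=$!
    sleep 5                                          # stand-in for waitforlisten

    $RPC dpdk_cryptodev_scan_accel_module            # register the DPDK cryptodev accel module
    $RPC accel_assign_opc -o encrypt -m dpdk_cryptodev
    $RPC accel_assign_opc -o decrypt -m dpdk_cryptodev
    "$SPDK/scripts/gen_nvme.sh" --json-with-subsystems | $RPC load_config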
00:06:37.263 09:10:46 json_config -- json_config/common.sh@25 -- # waitforlisten 46426 /var/tmp/spdk_tgt.sock 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@829 -- # '[' -z 46426 ']' 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:37.263 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:37.263 09:10:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.263 09:10:46 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:37.263 [2024-07-15 09:10:46.142956] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:06:37.263 [2024-07-15 09:10:46.143032] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid46426 ] 00:06:37.859 [2024-07-15 09:10:46.504602] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.859 [2024-07-15 09:10:46.595505] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.118 09:10:47 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:38.118 09:10:47 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:38.118 09:10:47 json_config -- json_config/common.sh@26 -- # echo '' 00:06:38.118 00:06:38.118 09:10:47 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:38.118 09:10:47 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:38.119 09:10:47 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:38.119 09:10:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:38.119 09:10:47 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:38.119 09:10:47 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:38.119 09:10:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:38.687 09:10:47 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:38.687 09:10:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:38.947 [2024-07-15 09:10:47.782978] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:38.947 09:10:47 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:38.947 09:10:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:39.516 [2024-07-15 09:10:48.284273] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:39.516 09:10:48 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:39.516 09:10:48 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:39.516 09:10:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:39.516 09:10:48 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:39.516 09:10:48 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:39.516 09:10:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:39.776 [2024-07-15 09:10:48.593731] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:42.312 09:10:51 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:42.312 09:10:51 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:42.312 09:10:51 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:42.312 09:10:51 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:42.312 09:10:51 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:42.312 09:10:51 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:42.312 09:10:51 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:06:42.312 09:10:51 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:42.312 09:10:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:42.312 09:10:51 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:42.572 09:10:51 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:42.572 09:10:51 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:42.572 09:10:51 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:42.572 09:10:51 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@111 -- # 
get_notifications 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:42.572 09:10:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:42.572 09:10:51 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:42.832 09:10:51 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:42.832 09:10:51 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.832 09:10:51 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.832 09:10:51 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:42.832 09:10:51 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:42.832 09:10:51 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:42.832 09:10:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:43.091 Nvme0n1p0 Nvme0n1p1 00:06:43.091 09:10:51 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:43.091 09:10:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:43.351 [2024-07-15 09:10:52.174474] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:43.351 [2024-07-15 09:10:52.174527] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:43.351 00:06:43.351 09:10:52 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:43.351 09:10:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:43.610 Malloc3 00:06:43.610 09:10:52 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:43.610 09:10:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:43.870 [2024-07-15 09:10:52.663873] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:43.870 [2024-07-15 09:10:52.663921] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:43.870 [2024-07-15 09:10:52.663950] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2487a00 00:06:43.870 [2024-07-15 09:10:52.663963] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:43.870 [2024-07-15 09:10:52.665544] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:43.870 [2024-07-15 09:10:52.665574] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:43.870 PTBdevFromMalloc3 00:06:43.870 09:10:52 json_config 
-- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:43.870 09:10:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:44.129 Null0 00:06:44.129 09:10:52 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:44.129 09:10:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:44.388 Malloc0 00:06:44.388 09:10:53 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:44.388 09:10:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:44.645 Malloc1 00:06:44.645 09:10:53 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:44.645 09:10:53 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:44.902 102400+0 records in 00:06:44.902 102400+0 records out 00:06:44.902 104857600 bytes (105 MB, 100 MiB) copied, 0.313976 s, 334 MB/s 00:06:44.902 09:10:53 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:44.902 09:10:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:45.160 aio_disk 00:06:45.160 09:10:53 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:45.160 09:10:53 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:45.160 09:10:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:50.425 bd448a92-92bb-4315-baac-37d5d5c5ed27 00:06:50.425 09:10:58 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:50.425 09:10:58 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:50.425 09:10:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:50.425 09:10:58 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:50.425 09:10:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:50.425 09:10:59 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:50.425 09:10:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:50.683 09:10:59 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:50.683 09:10:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:50.940 09:10:59 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:50.941 09:10:59 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:50.941 09:10:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:50.941 MallocForCryptoBdev 00:06:50.941 09:10:59 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:50.941 09:10:59 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:51.199 09:10:59 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:51.199 09:10:59 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:51.199 09:10:59 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:51.199 09:10:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:51.457 [2024-07-15 09:11:00.163895] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:51.457 CryptoMallocBdev 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:286d39e7-c456-4c19-91b4-96d792656092 bdev_register:44c779c4-373a-4694-bfdf-30b23e4985da bdev_register:af182c90-8a5e-4a61-93de-07e1b493629b bdev_register:9733f51c-95e1-4790-8f74-76eb25b82ba8 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 
bdev_register:Malloc1 bdev_register:aio_disk bdev_register:286d39e7-c456-4c19-91b4-96d792656092 bdev_register:44c779c4-373a-4694-bfdf-30b23e4985da bdev_register:af182c90-8a5e-4a61-93de-07e1b493629b bdev_register:9733f51c-95e1-4790-8f74-76eb25b82ba8 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@71 -- # sort 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@72 -- # sort 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:51.457 09:11:00 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:51.457 09:11:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:286d39e7-c456-4c19-91b4-96d792656092 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.716 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:44c779c4-373a-4694-bfdf-30b23e4985da 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:af182c90-8a5e-4a61-93de-07e1b493629b 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:9733f51c-95e1-4790-8f74-76eb25b82ba8 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:286d39e7-c456-4c19-91b4-96d792656092 bdev_register:44c779c4-373a-4694-bfdf-30b23e4985da bdev_register:9733f51c-95e1-4790-8f74-76eb25b82ba8 bdev_register:af182c90-8a5e-4a61-93de-07e1b493629b bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev 
bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\2\8\6\d\3\9\e\7\-\c\4\5\6\-\4\c\1\9\-\9\1\b\4\-\9\6\d\7\9\2\6\5\6\0\9\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\4\4\c\7\7\9\c\4\-\3\7\3\a\-\4\6\9\4\-\b\f\d\f\-\3\0\b\2\3\e\4\9\8\5\d\a\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\9\7\3\3\f\5\1\c\-\9\5\e\1\-\4\7\9\0\-\8\f\7\4\-\7\6\e\b\2\5\b\8\2\b\a\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\f\1\8\2\c\9\0\-\8\a\5\e\-\4\a\6\1\-\9\3\d\e\-\0\7\e\1\b\4\9\3\6\2\9\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@86 -- # cat 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:286d39e7-c456-4c19-91b4-96d792656092 bdev_register:44c779c4-373a-4694-bfdf-30b23e4985da bdev_register:9733f51c-95e1-4790-8f74-76eb25b82ba8 bdev_register:af182c90-8a5e-4a61-93de-07e1b493629b bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:51.717 Expected events matched: 00:06:51.717 bdev_register:286d39e7-c456-4c19-91b4-96d792656092 00:06:51.717 bdev_register:44c779c4-373a-4694-bfdf-30b23e4985da 00:06:51.717 bdev_register:9733f51c-95e1-4790-8f74-76eb25b82ba8 00:06:51.717 bdev_register:af182c90-8a5e-4a61-93de-07e1b493629b 00:06:51.717 bdev_register:aio_disk 00:06:51.717 bdev_register:CryptoMallocBdev 00:06:51.717 bdev_register:Malloc0 00:06:51.717 bdev_register:Malloc0p0 00:06:51.717 bdev_register:Malloc0p1 00:06:51.717 bdev_register:Malloc0p2 00:06:51.717 bdev_register:Malloc1 00:06:51.717 bdev_register:Malloc3 00:06:51.717 bdev_register:MallocForCryptoBdev 00:06:51.717 bdev_register:Null0 00:06:51.717 bdev_register:Nvme0n1 00:06:51.717 bdev_register:Nvme0n1p0 00:06:51.717 bdev_register:Nvme0n1p1 00:06:51.717 bdev_register:PTBdevFromMalloc3 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:51.717 09:11:00 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:51.717 09:11:00 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:51.717 09:11:00 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:51.717 09:11:00 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:51.717 09:11:00 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:51.717 09:11:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:51.976 MallocBdevForConfigChangeCheck 00:06:51.976 09:11:00 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:51.976 09:11:00 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:51.976 09:11:00 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.976 09:11:00 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:51.976 09:11:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:52.235 09:11:01 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:52.235 INFO: shutting down applications... 00:06:52.235 09:11:01 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:52.235 09:11:01 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:52.235 09:11:01 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:52.236 09:11:01 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:52.494 [2024-07-15 09:11:01.283459] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:55.887 Calling clear_iscsi_subsystem 00:06:55.887 Calling clear_nvmf_subsystem 00:06:55.887 Calling clear_nbd_subsystem 00:06:55.887 Calling clear_ublk_subsystem 00:06:55.887 Calling clear_vhost_blk_subsystem 00:06:55.887 Calling clear_vhost_scsi_subsystem 00:06:55.887 Calling clear_bdev_subsystem 00:06:55.888 09:11:04 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:55.888 09:11:04 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:55.888 09:11:04 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:55.888 09:11:04 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:55.888 09:11:04 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:55.888 09:11:04 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:55.888 09:11:04 json_config -- json_config/json_config.sh@345 -- # break 00:06:55.888 09:11:04 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:55.888 09:11:04 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:55.888 09:11:04 json_config -- json_config/common.sh@31 -- # local app=target 00:06:55.888 09:11:04 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:55.888 09:11:04 json_config -- json_config/common.sh@35 -- # [[ -n 
46426 ]] 00:06:55.888 09:11:04 json_config -- json_config/common.sh@38 -- # kill -SIGINT 46426 00:06:55.888 09:11:04 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:55.888 09:11:04 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:55.888 09:11:04 json_config -- json_config/common.sh@41 -- # kill -0 46426 00:06:55.888 09:11:04 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:56.147 09:11:05 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:56.147 09:11:05 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:56.147 09:11:05 json_config -- json_config/common.sh@41 -- # kill -0 46426 00:06:56.147 09:11:05 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:56.147 09:11:05 json_config -- json_config/common.sh@43 -- # break 00:06:56.147 09:11:05 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:56.147 09:11:05 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:56.147 SPDK target shutdown done 00:06:56.147 09:11:05 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:56.147 INFO: relaunching applications... 00:06:56.147 09:11:05 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:56.147 09:11:05 json_config -- json_config/common.sh@9 -- # local app=target 00:06:56.147 09:11:05 json_config -- json_config/common.sh@10 -- # shift 00:06:56.147 09:11:05 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:56.147 09:11:05 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:56.147 09:11:05 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:56.147 09:11:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:56.147 09:11:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:56.147 09:11:05 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=49046 00:06:56.147 09:11:05 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:56.147 Waiting for target to run... 00:06:56.147 09:11:05 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:56.147 09:11:05 json_config -- json_config/common.sh@25 -- # waitforlisten 49046 /var/tmp/spdk_tgt.sock 00:06:56.147 09:11:05 json_config -- common/autotest_common.sh@829 -- # '[' -z 49046 ']' 00:06:56.147 09:11:05 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:56.147 09:11:05 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:56.147 09:11:05 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:56.147 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:56.148 09:11:05 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:56.148 09:11:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:56.407 [2024-07-15 09:11:05.124553] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
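Note: the long sorted-list comparison a few entries above is tgt_check_notifications at work — the expected bdev_register events are sorted with printf/sort, the recorded events are fetched from the target with the notify_get_notifications RPC and reduced to type:ctx strings by jq, and the two sorted lists must match. A minimal sketch of that check follows; the EXPECTED array and the cut-based trimming of the event id are illustrative stand-ins for the script's IFS=: read loop, not code copied from json_config.sh.

    # Sketch only: EXPECTED is a hypothetical array holding the
    # bdev_register:<name> strings listed in the log above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    recorded=$("$rpc" -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 \
        | jq -r '.[] | "\(.type):\(.ctx):\(.id)"' | cut -d: -f1,2 | sort)
    expected=$(printf '%s\n' "${EXPECTED[@]}" | sort)
    [[ "$recorded" == "$expected" ]] || echo 'notification mismatch'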
00:06:56.407 [2024-07-15 09:11:05.124607] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid49046 ] 00:06:56.976 [2024-07-15 09:11:05.672305] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.976 [2024-07-15 09:11:05.775499] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.976 [2024-07-15 09:11:05.829617] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:56.976 [2024-07-15 09:11:05.837653] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:56.976 [2024-07-15 09:11:05.845670] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:56.976 [2024-07-15 09:11:05.926951] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:59.509 [2024-07-15 09:11:08.134795] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:59.509 [2024-07-15 09:11:08.134866] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:59.509 [2024-07-15 09:11:08.134882] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:59.509 [2024-07-15 09:11:08.142815] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:59.509 [2024-07-15 09:11:08.142843] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:59.509 [2024-07-15 09:11:08.150823] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:59.509 [2024-07-15 09:11:08.150847] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:59.509 [2024-07-15 09:11:08.158859] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:59.509 [2024-07-15 09:11:08.158889] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:59.509 [2024-07-15 09:11:08.158902] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:59.767 [2024-07-15 09:11:08.535130] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:59.767 [2024-07-15 09:11:08.535178] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:59.767 [2024-07-15 09:11:08.535196] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2332b90 00:06:59.767 [2024-07-15 09:11:08.535209] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:59.767 [2024-07-15 09:11:08.535496] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:59.767 [2024-07-15 09:11:08.535515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:59.767 09:11:08 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:59.767 09:11:08 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:59.767 09:11:08 json_config -- json_config/common.sh@26 -- # echo '' 00:06:59.767 00:06:59.767 09:11:08 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:59.767 09:11:08 json_config -- 
json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 00:06:59.767 INFO: Checking if target configuration is the same... 00:06:59.767 09:11:08 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:59.767 09:11:08 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:59.767 09:11:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:59.767 + '[' 2 -ne 2 ']' 00:06:59.767 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:59.767 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:59.767 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:59.767 +++ basename /dev/fd/62 00:06:59.767 ++ mktemp /tmp/62.XXX 00:06:59.767 + tmp_file_1=/tmp/62.Cro 00:06:59.767 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:59.767 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:59.767 + tmp_file_2=/tmp/spdk_tgt_config.json.8M3 00:06:59.767 + ret=0 00:06:59.767 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:00.333 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:00.333 + diff -u /tmp/62.Cro /tmp/spdk_tgt_config.json.8M3 00:07:00.333 + echo 'INFO: JSON config files are the same' 00:07:00.333 INFO: JSON config files are the same 00:07:00.333 + rm /tmp/62.Cro /tmp/spdk_tgt_config.json.8M3 00:07:00.333 + exit 0 00:07:00.333 09:11:09 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:07:00.333 09:11:09 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:07:00.333 INFO: changing configuration and checking if this can be detected... 00:07:00.333 09:11:09 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:00.333 09:11:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:00.333 09:11:09 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:00.333 09:11:09 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:07:00.333 09:11:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:00.591 + '[' 2 -ne 2 ']' 00:07:00.591 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:00.591 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
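Note: the "Checking if target configuration is the same..." step diffs two normalized JSON dumps — the live configuration fetched with save_config and the spdk_tgt_config.json written earlier, each passed through config_filter.py -method sort. A rough equivalent is sketched below, assuming config_filter.py reads the configuration on stdin as json_diff.sh pipes it; the /tmp file names are illustrative, the real script uses mktemp as shown in the trace.

    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$spdk/scripts/rpc.py" -s /var/tmp/spdk_tgt.sock save_config \
        | "$spdk/test/json_config/config_filter.py" -method sort > /tmp/live.sorted.json
    "$spdk/test/json_config/config_filter.py" -method sort \
        < "$spdk/spdk_tgt_config.json" > /tmp/saved.sorted.json
    diff -u /tmp/live.sorted.json /tmp/saved.sorted.json \
        && echo 'INFO: JSON config files are the same'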
00:07:00.591 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:00.591 +++ basename /dev/fd/62 00:07:00.591 ++ mktemp /tmp/62.XXX 00:07:00.591 + tmp_file_1=/tmp/62.qvS 00:07:00.591 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:00.591 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:00.591 + tmp_file_2=/tmp/spdk_tgt_config.json.cBu 00:07:00.591 + ret=0 00:07:00.591 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:00.850 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:00.850 + diff -u /tmp/62.qvS /tmp/spdk_tgt_config.json.cBu 00:07:00.850 + ret=1 00:07:00.850 + echo '=== Start of file: /tmp/62.qvS ===' 00:07:00.850 + cat /tmp/62.qvS 00:07:00.850 + echo '=== End of file: /tmp/62.qvS ===' 00:07:00.850 + echo '' 00:07:00.850 + echo '=== Start of file: /tmp/spdk_tgt_config.json.cBu ===' 00:07:00.850 + cat /tmp/spdk_tgt_config.json.cBu 00:07:00.850 + echo '=== End of file: /tmp/spdk_tgt_config.json.cBu ===' 00:07:00.850 + echo '' 00:07:00.850 + rm /tmp/62.qvS /tmp/spdk_tgt_config.json.cBu 00:07:00.850 + exit 1 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:07:00.850 INFO: configuration change detected. 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:07:00.850 09:11:09 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:00.850 09:11:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@317 -- # [[ -n 49046 ]] 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:07:00.850 09:11:09 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:00.850 09:11:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:07:00.850 09:11:09 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:00.850 09:11:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:01.108 09:11:09 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:01.108 09:11:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:07:01.108 09:11:10 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:07:01.108 09:11:10 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:07:01.366 09:11:10 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:07:01.366 09:11:10 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:07:01.624 09:11:10 json_config -- json_config/json_config.sh@193 -- # uname -s 00:07:01.624 09:11:10 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:07:01.624 09:11:10 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:07:01.624 09:11:10 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:07:01.624 09:11:10 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:07:01.624 09:11:10 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:01.624 09:11:10 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:01.624 09:11:10 json_config -- json_config/json_config.sh@323 -- # killprocess 49046 00:07:01.624 09:11:10 json_config -- common/autotest_common.sh@948 -- # '[' -z 49046 ']' 00:07:01.624 09:11:10 json_config -- common/autotest_common.sh@952 -- # kill -0 49046 00:07:01.624 09:11:10 json_config -- common/autotest_common.sh@953 -- # uname 00:07:01.625 09:11:10 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:01.625 09:11:10 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 49046 00:07:01.625 09:11:10 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:01.625 09:11:10 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:01.625 09:11:10 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 49046' 00:07:01.625 killing process with pid 49046 00:07:01.625 09:11:10 json_config -- common/autotest_common.sh@967 -- # kill 49046 00:07:01.625 09:11:10 json_config -- common/autotest_common.sh@972 -- # wait 49046 00:07:04.920 09:11:13 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:04.920 09:11:13 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:07:04.920 09:11:13 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:04.920 09:11:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:04.920 09:11:13 json_config -- json_config/json_config.sh@328 -- # return 0 00:07:04.920 09:11:13 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:07:04.920 INFO: Success 00:07:04.920 00:07:04.920 real 0m27.830s 00:07:04.920 user 0m33.667s 00:07:04.920 sys 0m3.776s 00:07:04.920 09:11:13 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:04.920 09:11:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:04.920 ************************************ 00:07:04.920 END TEST json_config 00:07:04.920 ************************************ 00:07:04.920 09:11:13 -- common/autotest_common.sh@1142 -- # return 0 00:07:04.920 09:11:13 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:04.920 09:11:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:04.920 09:11:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.920 09:11:13 -- common/autotest_common.sh@10 -- # set +x 00:07:05.179 ************************************ 00:07:05.179 START TEST json_config_extra_key 00:07:05.179 ************************************ 00:07:05.179 09:11:13 
json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:05.179 09:11:13 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:05.179 09:11:13 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:05.179 09:11:13 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:05.179 09:11:13 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:05.179 09:11:13 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:05.179 09:11:13 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:05.179 09:11:13 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:05.179 09:11:13 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:05.179 09:11:13 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:05.179 09:11:13 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:05.179 INFO: launching applications... 
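Note: json_config_extra_key drives the same spdk_tgt binary as the previous test but boots it from test/json_config/extra_key.json instead of a saved target config. The launch echoed below reduces to roughly the following; the backgrounding and the waitforlisten polling are paraphrased from json_config/common.sh, not copied verbatim.

    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 \
        -r /var/tmp/spdk_tgt.sock \
        --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json &
    # waitforlisten then retries an RPC against /var/tmp/spdk_tgt.sock until the
    # target answers or a retry limit is hit.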
00:07:05.179 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=50381 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:05.179 Waiting for target to run... 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 50381 /var/tmp/spdk_tgt.sock 00:07:05.179 09:11:14 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 50381 ']' 00:07:05.179 09:11:14 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:05.179 09:11:14 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:05.179 09:11:14 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:05.179 09:11:14 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:05.179 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:05.179 09:11:14 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:05.179 09:11:14 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:05.179 [2024-07-15 09:11:14.081010] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:05.179 [2024-07-15 09:11:14.081085] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid50381 ] 00:07:05.745 [2024-07-15 09:11:14.674288] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.004 [2024-07-15 09:11:14.785562] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.004 09:11:14 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:06.004 09:11:14 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:07:06.004 09:11:14 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:06.004 00:07:06.004 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:06.004 INFO: shutting down applications... 
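Note: json_config_test_shutdown_app, which runs next, sends SIGINT to the target and then polls it for up to 30 half-second intervals before declaring the shutdown done. A minimal sketch with the PID parameterized (50381 is this run's app_pid for the target); the loop body is a simplification of json_config/common.sh.

    pid=50381                          # app_pid["target"] for this run
    kill -SIGINT "$pid"
    for ((i = 0; i < 30; i++)); do
        if ! kill -0 "$pid" 2>/dev/null; then
            echo 'SPDK target shutdown done'
            break
        fi
        sleep 0.5
    done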
00:07:06.004 09:11:14 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:06.004 09:11:14 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:06.004 09:11:14 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:06.004 09:11:14 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 50381 ]] 00:07:06.004 09:11:14 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 50381 00:07:06.004 09:11:14 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:06.004 09:11:14 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:06.004 09:11:14 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 50381 00:07:06.004 09:11:14 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:06.571 09:11:15 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:06.571 09:11:15 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:06.571 09:11:15 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 50381 00:07:06.571 09:11:15 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:06.571 09:11:15 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:06.571 09:11:15 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:06.571 09:11:15 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:06.571 SPDK target shutdown done 00:07:06.571 09:11:15 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:06.571 Success 00:07:06.571 00:07:06.571 real 0m1.552s 00:07:06.571 user 0m0.933s 00:07:06.571 sys 0m0.728s 00:07:06.571 09:11:15 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:06.571 09:11:15 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:06.571 ************************************ 00:07:06.571 END TEST json_config_extra_key 00:07:06.571 ************************************ 00:07:06.571 09:11:15 -- common/autotest_common.sh@1142 -- # return 0 00:07:06.571 09:11:15 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:06.571 09:11:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:06.571 09:11:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:06.571 09:11:15 -- common/autotest_common.sh@10 -- # set +x 00:07:06.571 ************************************ 00:07:06.571 START TEST alias_rpc 00:07:06.571 ************************************ 00:07:06.571 09:11:15 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:06.830 * Looking for test storage... 
00:07:06.830 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:06.830 09:11:15 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:06.830 09:11:15 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=50613 00:07:06.830 09:11:15 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 50613 00:07:06.830 09:11:15 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:06.830 09:11:15 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 50613 ']' 00:07:06.830 09:11:15 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:06.830 09:11:15 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:06.830 09:11:15 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:06.830 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:06.830 09:11:15 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:06.830 09:11:15 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:06.830 [2024-07-15 09:11:15.687959] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:06.830 [2024-07-15 09:11:15.688033] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid50613 ] 00:07:07.089 [2024-07-15 09:11:15.819010] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.089 [2024-07-15 09:11:15.915924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.654 09:11:16 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:07.911 09:11:16 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:07.911 09:11:16 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:08.169 09:11:16 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 50613 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 50613 ']' 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 50613 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 50613 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 50613' 00:07:08.169 killing process with pid 50613 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@967 -- # kill 50613 00:07:08.169 09:11:16 alias_rpc -- common/autotest_common.sh@972 -- # wait 50613 00:07:08.427 00:07:08.427 real 0m1.795s 00:07:08.427 user 0m1.985s 00:07:08.427 sys 0m0.556s 00:07:08.427 09:11:17 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:08.427 09:11:17 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:08.427 ************************************ 00:07:08.427 END TEST alias_rpc 00:07:08.427 
************************************ 00:07:08.427 09:11:17 -- common/autotest_common.sh@1142 -- # return 0 00:07:08.427 09:11:17 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:08.427 09:11:17 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:08.427 09:11:17 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:08.427 09:11:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.427 09:11:17 -- common/autotest_common.sh@10 -- # set +x 00:07:08.686 ************************************ 00:07:08.686 START TEST spdkcli_tcp 00:07:08.687 ************************************ 00:07:08.687 09:11:17 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:08.687 * Looking for test storage... 00:07:08.687 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:08.687 09:11:17 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:08.687 09:11:17 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=50853 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 50853 00:07:08.687 09:11:17 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:08.687 09:11:17 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 50853 ']' 00:07:08.687 09:11:17 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:08.687 09:11:17 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:08.687 09:11:17 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:08.687 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:08.687 09:11:17 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:08.687 09:11:17 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:08.687 [2024-07-15 09:11:17.587713] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
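Note: the spdkcli_tcp test exercises the RPC server over TCP rather than the default UNIX socket — it starts spdk_tgt with -m 0x3 -p 0, bridges TCP port 9998 to /var/tmp/spdk.sock with socat, and then issues RPCs against 127.0.0.1:9998. Both commands appear further below in the trace; a condensed sketch:

    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods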
00:07:08.687 [2024-07-15 09:11:17.587793] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid50853 ] 00:07:08.945 [2024-07-15 09:11:17.717906] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:08.945 [2024-07-15 09:11:17.816379] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:08.945 [2024-07-15 09:11:17.816384] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.878 09:11:18 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:09.878 09:11:18 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:07:09.878 09:11:18 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:09.878 09:11:18 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=51029 00:07:09.878 09:11:18 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:09.878 [ 00:07:09.878 "bdev_malloc_delete", 00:07:09.878 "bdev_malloc_create", 00:07:09.878 "bdev_null_resize", 00:07:09.878 "bdev_null_delete", 00:07:09.878 "bdev_null_create", 00:07:09.878 "bdev_nvme_cuse_unregister", 00:07:09.878 "bdev_nvme_cuse_register", 00:07:09.878 "bdev_opal_new_user", 00:07:09.878 "bdev_opal_set_lock_state", 00:07:09.878 "bdev_opal_delete", 00:07:09.878 "bdev_opal_get_info", 00:07:09.878 "bdev_opal_create", 00:07:09.878 "bdev_nvme_opal_revert", 00:07:09.878 "bdev_nvme_opal_init", 00:07:09.878 "bdev_nvme_send_cmd", 00:07:09.878 "bdev_nvme_get_path_iostat", 00:07:09.878 "bdev_nvme_get_mdns_discovery_info", 00:07:09.878 "bdev_nvme_stop_mdns_discovery", 00:07:09.878 "bdev_nvme_start_mdns_discovery", 00:07:09.878 "bdev_nvme_set_multipath_policy", 00:07:09.878 "bdev_nvme_set_preferred_path", 00:07:09.878 "bdev_nvme_get_io_paths", 00:07:09.878 "bdev_nvme_remove_error_injection", 00:07:09.878 "bdev_nvme_add_error_injection", 00:07:09.878 "bdev_nvme_get_discovery_info", 00:07:09.878 "bdev_nvme_stop_discovery", 00:07:09.878 "bdev_nvme_start_discovery", 00:07:09.878 "bdev_nvme_get_controller_health_info", 00:07:09.878 "bdev_nvme_disable_controller", 00:07:09.878 "bdev_nvme_enable_controller", 00:07:09.878 "bdev_nvme_reset_controller", 00:07:09.878 "bdev_nvme_get_transport_statistics", 00:07:09.878 "bdev_nvme_apply_firmware", 00:07:09.878 "bdev_nvme_detach_controller", 00:07:09.878 "bdev_nvme_get_controllers", 00:07:09.878 "bdev_nvme_attach_controller", 00:07:09.878 "bdev_nvme_set_hotplug", 00:07:09.878 "bdev_nvme_set_options", 00:07:09.878 "bdev_passthru_delete", 00:07:09.878 "bdev_passthru_create", 00:07:09.878 "bdev_lvol_set_parent_bdev", 00:07:09.878 "bdev_lvol_set_parent", 00:07:09.878 "bdev_lvol_check_shallow_copy", 00:07:09.878 "bdev_lvol_start_shallow_copy", 00:07:09.878 "bdev_lvol_grow_lvstore", 00:07:09.878 "bdev_lvol_get_lvols", 00:07:09.878 "bdev_lvol_get_lvstores", 00:07:09.878 "bdev_lvol_delete", 00:07:09.878 "bdev_lvol_set_read_only", 00:07:09.878 "bdev_lvol_resize", 00:07:09.878 "bdev_lvol_decouple_parent", 00:07:09.878 "bdev_lvol_inflate", 00:07:09.878 "bdev_lvol_rename", 00:07:09.878 "bdev_lvol_clone_bdev", 00:07:09.878 "bdev_lvol_clone", 00:07:09.878 "bdev_lvol_snapshot", 00:07:09.878 "bdev_lvol_create", 00:07:09.878 "bdev_lvol_delete_lvstore", 00:07:09.878 "bdev_lvol_rename_lvstore", 00:07:09.878 "bdev_lvol_create_lvstore", 
00:07:09.878 "bdev_raid_set_options", 00:07:09.878 "bdev_raid_remove_base_bdev", 00:07:09.878 "bdev_raid_add_base_bdev", 00:07:09.878 "bdev_raid_delete", 00:07:09.879 "bdev_raid_create", 00:07:09.879 "bdev_raid_get_bdevs", 00:07:09.879 "bdev_error_inject_error", 00:07:09.879 "bdev_error_delete", 00:07:09.879 "bdev_error_create", 00:07:09.879 "bdev_split_delete", 00:07:09.879 "bdev_split_create", 00:07:09.879 "bdev_delay_delete", 00:07:09.879 "bdev_delay_create", 00:07:09.879 "bdev_delay_update_latency", 00:07:09.879 "bdev_zone_block_delete", 00:07:09.879 "bdev_zone_block_create", 00:07:09.879 "blobfs_create", 00:07:09.879 "blobfs_detect", 00:07:09.879 "blobfs_set_cache_size", 00:07:09.879 "bdev_crypto_delete", 00:07:09.879 "bdev_crypto_create", 00:07:09.879 "bdev_compress_delete", 00:07:09.879 "bdev_compress_create", 00:07:09.879 "bdev_compress_get_orphans", 00:07:09.879 "bdev_aio_delete", 00:07:09.879 "bdev_aio_rescan", 00:07:09.879 "bdev_aio_create", 00:07:09.879 "bdev_ftl_set_property", 00:07:09.879 "bdev_ftl_get_properties", 00:07:09.879 "bdev_ftl_get_stats", 00:07:09.879 "bdev_ftl_unmap", 00:07:09.879 "bdev_ftl_unload", 00:07:09.879 "bdev_ftl_delete", 00:07:09.879 "bdev_ftl_load", 00:07:09.879 "bdev_ftl_create", 00:07:09.879 "bdev_virtio_attach_controller", 00:07:09.879 "bdev_virtio_scsi_get_devices", 00:07:09.879 "bdev_virtio_detach_controller", 00:07:09.879 "bdev_virtio_blk_set_hotplug", 00:07:09.879 "bdev_iscsi_delete", 00:07:09.879 "bdev_iscsi_create", 00:07:09.879 "bdev_iscsi_set_options", 00:07:09.879 "accel_error_inject_error", 00:07:09.879 "ioat_scan_accel_module", 00:07:09.879 "dsa_scan_accel_module", 00:07:09.879 "iaa_scan_accel_module", 00:07:09.879 "dpdk_cryptodev_get_driver", 00:07:09.879 "dpdk_cryptodev_set_driver", 00:07:09.879 "dpdk_cryptodev_scan_accel_module", 00:07:09.879 "compressdev_scan_accel_module", 00:07:09.879 "keyring_file_remove_key", 00:07:09.879 "keyring_file_add_key", 00:07:09.879 "keyring_linux_set_options", 00:07:09.879 "iscsi_get_histogram", 00:07:09.879 "iscsi_enable_histogram", 00:07:09.879 "iscsi_set_options", 00:07:09.879 "iscsi_get_auth_groups", 00:07:09.879 "iscsi_auth_group_remove_secret", 00:07:09.879 "iscsi_auth_group_add_secret", 00:07:09.879 "iscsi_delete_auth_group", 00:07:09.879 "iscsi_create_auth_group", 00:07:09.879 "iscsi_set_discovery_auth", 00:07:09.879 "iscsi_get_options", 00:07:09.879 "iscsi_target_node_request_logout", 00:07:09.879 "iscsi_target_node_set_redirect", 00:07:09.879 "iscsi_target_node_set_auth", 00:07:09.879 "iscsi_target_node_add_lun", 00:07:09.879 "iscsi_get_stats", 00:07:09.879 "iscsi_get_connections", 00:07:09.879 "iscsi_portal_group_set_auth", 00:07:09.879 "iscsi_start_portal_group", 00:07:09.879 "iscsi_delete_portal_group", 00:07:09.879 "iscsi_create_portal_group", 00:07:09.879 "iscsi_get_portal_groups", 00:07:09.879 "iscsi_delete_target_node", 00:07:09.879 "iscsi_target_node_remove_pg_ig_maps", 00:07:09.879 "iscsi_target_node_add_pg_ig_maps", 00:07:09.879 "iscsi_create_target_node", 00:07:09.879 "iscsi_get_target_nodes", 00:07:09.879 "iscsi_delete_initiator_group", 00:07:09.879 "iscsi_initiator_group_remove_initiators", 00:07:09.879 "iscsi_initiator_group_add_initiators", 00:07:09.879 "iscsi_create_initiator_group", 00:07:09.879 "iscsi_get_initiator_groups", 00:07:09.879 "nvmf_set_crdt", 00:07:09.879 "nvmf_set_config", 00:07:09.879 "nvmf_set_max_subsystems", 00:07:09.879 "nvmf_stop_mdns_prr", 00:07:09.879 "nvmf_publish_mdns_prr", 00:07:09.879 "nvmf_subsystem_get_listeners", 00:07:09.879 
"nvmf_subsystem_get_qpairs", 00:07:09.879 "nvmf_subsystem_get_controllers", 00:07:09.879 "nvmf_get_stats", 00:07:09.879 "nvmf_get_transports", 00:07:09.879 "nvmf_create_transport", 00:07:09.879 "nvmf_get_targets", 00:07:09.879 "nvmf_delete_target", 00:07:09.879 "nvmf_create_target", 00:07:09.879 "nvmf_subsystem_allow_any_host", 00:07:09.879 "nvmf_subsystem_remove_host", 00:07:09.879 "nvmf_subsystem_add_host", 00:07:09.879 "nvmf_ns_remove_host", 00:07:09.879 "nvmf_ns_add_host", 00:07:09.879 "nvmf_subsystem_remove_ns", 00:07:09.879 "nvmf_subsystem_add_ns", 00:07:09.879 "nvmf_subsystem_listener_set_ana_state", 00:07:09.879 "nvmf_discovery_get_referrals", 00:07:09.879 "nvmf_discovery_remove_referral", 00:07:09.879 "nvmf_discovery_add_referral", 00:07:09.879 "nvmf_subsystem_remove_listener", 00:07:09.879 "nvmf_subsystem_add_listener", 00:07:09.879 "nvmf_delete_subsystem", 00:07:09.879 "nvmf_create_subsystem", 00:07:09.879 "nvmf_get_subsystems", 00:07:09.879 "env_dpdk_get_mem_stats", 00:07:09.879 "nbd_get_disks", 00:07:09.879 "nbd_stop_disk", 00:07:09.879 "nbd_start_disk", 00:07:09.879 "ublk_recover_disk", 00:07:09.879 "ublk_get_disks", 00:07:09.879 "ublk_stop_disk", 00:07:09.879 "ublk_start_disk", 00:07:09.879 "ublk_destroy_target", 00:07:09.879 "ublk_create_target", 00:07:09.879 "virtio_blk_create_transport", 00:07:09.879 "virtio_blk_get_transports", 00:07:09.879 "vhost_controller_set_coalescing", 00:07:09.879 "vhost_get_controllers", 00:07:09.879 "vhost_delete_controller", 00:07:09.879 "vhost_create_blk_controller", 00:07:09.879 "vhost_scsi_controller_remove_target", 00:07:09.879 "vhost_scsi_controller_add_target", 00:07:09.879 "vhost_start_scsi_controller", 00:07:09.879 "vhost_create_scsi_controller", 00:07:09.879 "thread_set_cpumask", 00:07:09.879 "framework_get_governor", 00:07:09.879 "framework_get_scheduler", 00:07:09.879 "framework_set_scheduler", 00:07:09.879 "framework_get_reactors", 00:07:09.879 "thread_get_io_channels", 00:07:09.879 "thread_get_pollers", 00:07:09.879 "thread_get_stats", 00:07:09.879 "framework_monitor_context_switch", 00:07:09.879 "spdk_kill_instance", 00:07:09.879 "log_enable_timestamps", 00:07:09.879 "log_get_flags", 00:07:09.879 "log_clear_flag", 00:07:09.879 "log_set_flag", 00:07:09.879 "log_get_level", 00:07:09.879 "log_set_level", 00:07:09.879 "log_get_print_level", 00:07:09.879 "log_set_print_level", 00:07:09.879 "framework_enable_cpumask_locks", 00:07:09.879 "framework_disable_cpumask_locks", 00:07:09.879 "framework_wait_init", 00:07:09.879 "framework_start_init", 00:07:09.879 "scsi_get_devices", 00:07:09.879 "bdev_get_histogram", 00:07:09.879 "bdev_enable_histogram", 00:07:09.879 "bdev_set_qos_limit", 00:07:09.879 "bdev_set_qd_sampling_period", 00:07:09.879 "bdev_get_bdevs", 00:07:09.879 "bdev_reset_iostat", 00:07:09.879 "bdev_get_iostat", 00:07:09.879 "bdev_examine", 00:07:09.879 "bdev_wait_for_examine", 00:07:09.879 "bdev_set_options", 00:07:09.879 "notify_get_notifications", 00:07:09.879 "notify_get_types", 00:07:09.879 "accel_get_stats", 00:07:09.879 "accel_set_options", 00:07:09.879 "accel_set_driver", 00:07:09.879 "accel_crypto_key_destroy", 00:07:09.879 "accel_crypto_keys_get", 00:07:09.879 "accel_crypto_key_create", 00:07:09.879 "accel_assign_opc", 00:07:09.879 "accel_get_module_info", 00:07:09.879 "accel_get_opc_assignments", 00:07:09.879 "vmd_rescan", 00:07:09.879 "vmd_remove_device", 00:07:09.879 "vmd_enable", 00:07:09.879 "sock_get_default_impl", 00:07:09.879 "sock_set_default_impl", 00:07:09.879 "sock_impl_set_options", 00:07:09.879 
"sock_impl_get_options", 00:07:09.879 "iobuf_get_stats", 00:07:09.879 "iobuf_set_options", 00:07:09.879 "framework_get_pci_devices", 00:07:09.879 "framework_get_config", 00:07:09.879 "framework_get_subsystems", 00:07:09.879 "trace_get_info", 00:07:09.879 "trace_get_tpoint_group_mask", 00:07:09.879 "trace_disable_tpoint_group", 00:07:09.879 "trace_enable_tpoint_group", 00:07:09.879 "trace_clear_tpoint_mask", 00:07:09.879 "trace_set_tpoint_mask", 00:07:09.879 "keyring_get_keys", 00:07:09.879 "spdk_get_version", 00:07:09.879 "rpc_get_methods" 00:07:09.879 ] 00:07:09.879 09:11:18 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:09.879 09:11:18 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:09.879 09:11:18 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:09.879 09:11:18 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:09.879 09:11:18 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 50853 00:07:09.879 09:11:18 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 50853 ']' 00:07:09.879 09:11:18 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 50853 00:07:09.879 09:11:18 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:07:09.879 09:11:18 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:09.879 09:11:18 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 50853 00:07:10.137 09:11:18 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:10.137 09:11:18 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:10.137 09:11:18 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 50853' 00:07:10.137 killing process with pid 50853 00:07:10.137 09:11:18 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 50853 00:07:10.137 09:11:18 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 50853 00:07:10.395 00:07:10.395 real 0m1.813s 00:07:10.395 user 0m3.267s 00:07:10.395 sys 0m0.611s 00:07:10.395 09:11:19 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:10.395 09:11:19 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:10.395 ************************************ 00:07:10.395 END TEST spdkcli_tcp 00:07:10.395 ************************************ 00:07:10.395 09:11:19 -- common/autotest_common.sh@1142 -- # return 0 00:07:10.395 09:11:19 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:10.395 09:11:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:10.395 09:11:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.395 09:11:19 -- common/autotest_common.sh@10 -- # set +x 00:07:10.395 ************************************ 00:07:10.395 START TEST dpdk_mem_utility 00:07:10.395 ************************************ 00:07:10.395 09:11:19 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:10.653 * Looking for test storage... 
00:07:10.653 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:10.653 09:11:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:10.653 09:11:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=51262 00:07:10.653 09:11:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 51262 00:07:10.653 09:11:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:10.653 09:11:19 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 51262 ']' 00:07:10.653 09:11:19 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:10.653 09:11:19 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:10.653 09:11:19 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:10.653 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:10.653 09:11:19 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:10.653 09:11:19 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:10.653 [2024-07-15 09:11:19.470867] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:10.653 [2024-07-15 09:11:19.470947] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid51262 ] 00:07:10.653 [2024-07-15 09:11:19.603736] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.912 [2024-07-15 09:11:19.701097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.477 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:11.477 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:07:11.477 09:11:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:11.477 09:11:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:11.477 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:11.477 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:11.477 { 00:07:11.477 "filename": "/tmp/spdk_mem_dump.txt" 00:07:11.477 } 00:07:11.477 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:11.477 09:11:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:11.743 DPDK memory size 816.000000 MiB in 2 heap(s) 00:07:11.743 2 heaps totaling size 816.000000 MiB 00:07:11.743 size: 814.000000 MiB heap id: 0 00:07:11.743 size: 2.000000 MiB heap id: 1 00:07:11.743 end heaps---------- 00:07:11.743 8 mempools totaling size 598.116089 MiB 00:07:11.743 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:11.743 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:11.743 size: 84.521057 MiB name: bdev_io_51262 00:07:11.743 size: 51.011292 MiB name: evtpool_51262 00:07:11.743 size: 50.003479 MiB name: msgpool_51262 
00:07:11.743 size: 21.763794 MiB name: PDU_Pool 00:07:11.743 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:11.743 size: 0.026123 MiB name: Session_Pool 00:07:11.743 end mempools------- 00:07:11.743 201 memzones totaling size 4.176453 MiB 00:07:11.743 size: 1.000366 MiB name: RG_ring_0_51262 00:07:11.743 size: 1.000366 MiB name: RG_ring_1_51262 00:07:11.743 size: 1.000366 MiB name: RG_ring_4_51262 00:07:11.743 size: 1.000366 MiB name: RG_ring_5_51262 00:07:11.743 size: 0.125366 MiB name: RG_ring_2_51262 00:07:11.743 size: 0.015991 MiB name: RG_ring_3_51262 00:07:11.743 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:11.743 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:01.7_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:11.743 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:11.743 size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:11.743 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:11.743 size: 0.000122 MiB name: 
rte_cryptodev_data_2 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:11.743 size: 0.000122 MiB 
name: rte_compressdev_data_20 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_72 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_73 00:07:11.743 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:11.743 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_80 
00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_94 00:07:11.744 size: 0.000122 MiB name: rte_cryptodev_data_95 00:07:11.744 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:11.744 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:11.744 end memzones------- 00:07:11.744 09:11:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:11.744 heap id: 0 total size: 814.000000 MiB number of busy elements: 517 number of free elements: 14 00:07:11.744 list of free elements. size: 11.815186 MiB 00:07:11.744 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:11.744 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:11.744 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:11.744 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:11.744 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:11.744 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:11.744 element at address: 0x200007000000 with size: 0.960022 MiB 00:07:11.744 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:11.744 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:07:11.744 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:11.744 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:11.744 element at address: 0x200000800000 with size: 0.486694 MiB 00:07:11.744 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:11.744 element at address: 0x200027e00000 with size: 0.403259 MiB 00:07:11.744 list of standard malloc elements. 
size: 199.876526 MiB 00:07:11.744 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:11.744 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:11.744 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:11.744 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:11.744 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:11.744 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:11.744 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:11.744 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:11.744 element at address: 0x200000330b40 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000337640 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000033e140 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000344c40 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000034b740 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000352240 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000358d40 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000035f840 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000366880 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000036a340 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000036de00 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003718c0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000375380 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000378e40 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000037c900 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000383e80 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000387940 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000038b400 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000392980 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000396440 with size: 0.004395 MiB 00:07:11.744 element at address: 0x200000399f00 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003bea80 with size: 0.004395 MiB 
00:07:11.744 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003d4b00 with size: 0.004395 MiB 00:07:11.744 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:11.744 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000333040 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000335540 with size: 0.004028 MiB 00:07:11.744 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000339b40 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000033c040 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000340640 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000342b40 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000347140 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000349640 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000350140 with size: 0.004028 MiB 00:07:11.744 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000354740 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000356c40 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000035a1c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000035b240 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000035d740 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000361d40 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000364780 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000365800 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000368240 with size: 0.004028 MiB 00:07:11.744 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000370840 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000373280 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000374300 with size: 0.004028 MiB 00:07:11.744 element at 
address: 0x200000376d40 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000037a800 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000037b880 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000037f340 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000381d80 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000382e00 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000385840 with size: 0.004028 MiB 00:07:11.744 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000389300 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000038a380 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000038cdc0 with size: 0.004028 MiB 00:07:11.744 element at address: 0x20000038de40 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000390880 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000391900 with size: 0.004028 MiB 00:07:11.744 element at address: 0x200000394340 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:07:11.745 element at address: 0x200000397e00 with size: 0.004028 MiB 00:07:11.745 element at address: 0x200000398e80 with size: 0.004028 MiB 00:07:11.745 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:07:11.745 element at address: 0x20000039c940 with size: 0.004028 MiB 00:07:11.745 element at address: 0x20000039f380 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003c0440 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003cffc0 
with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:11.745 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:11.745 element at address: 0x200000204d40 with size: 0.000305 MiB 00:07:11.745 element at address: 0x200000200000 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200180 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200240 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200300 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200480 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200540 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200600 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200780 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200840 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200900 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200a80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200b40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200c00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200d80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200e40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200f00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201080 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201140 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201200 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201380 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201440 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201500 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201680 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201740 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201800 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201980 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201a40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201b00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201c80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201d40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201e00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000201f80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202040 with size: 0.000183 MiB 
00:07:11.745 element at address: 0x200000202100 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202280 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202340 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202400 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202580 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202640 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202700 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202880 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202940 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202a00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202b80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202c40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202d00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202e80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000202f40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203000 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203180 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203240 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203300 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203480 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203540 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203600 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203780 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203840 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203900 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203a80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203b40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203c00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203d80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203e40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203f00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204080 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204140 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204200 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204380 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204440 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204500 with size: 0.000183 MiB 00:07:11.745 element at 
address: 0x2000002045c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204680 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204740 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204800 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204980 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204a40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204b00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204c80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204e80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000204f40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205000 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205180 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205240 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205300 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205480 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205540 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205600 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205780 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205840 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205900 with size: 0.000183 MiB 00:07:11.745 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205a80 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205b40 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205c00 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:07:11.745 element at address: 0x200000205d80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000205e40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000205f00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000206080 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000206140 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000206200 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000020a780 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022af80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b040 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b100 
with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b280 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b340 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b400 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b580 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b640 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b700 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b900 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022be40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022c080 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022c140 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022c200 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022c380 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022c440 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000022c500 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000032e700 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000331d40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000338840 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000033f340 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000345e40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000034c940 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000034fec0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000353440 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000359f40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000360a40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000364180 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000364240 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000364400 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000367a80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000367c40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000367d00 with size: 0.000183 MiB 
00:07:11.746 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000036b540 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000036b700 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000036b980 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000036f000 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000036f280 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000036f440 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000372c80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000372d40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000372f00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000376580 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000376740 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000376800 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000037a040 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000037a200 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000037a480 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000037db00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000037df40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000381780 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000381840 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000381a00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000385080 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000385240 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000385300 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000388b40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000388d00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000388f80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000038c600 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000038c880 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000390280 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000390340 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000390500 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000393b80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000393d40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000393e00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:07:11.746 element at 
address: 0x200000397640 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000397800 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x200000397a80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000039b100 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000039b380 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000039b540 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x20000039f000 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:07:11.746 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003c3900 
with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:07:11.747 element at address: 0x20000087c980 with size: 0.000183 MiB 00:07:11.747 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:07:11.747 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:07:11.747 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:07:11.747 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:07:11.747 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:11.747 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e673c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e67480 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e080 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6ed00 with size: 0.000183 MiB 
00:07:11.747 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:11.747 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:11.747 list of memzone associated elements. size: 602.308289 MiB 00:07:11.747 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:11.747 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:11.747 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:11.747 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:11.747 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:11.747 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_51262_0 00:07:11.747 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:11.747 associated memzone info: size: 48.002930 MiB name: MP_evtpool_51262_0 00:07:11.747 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:11.747 associated memzone info: size: 48.002930 MiB name: MP_msgpool_51262_0 00:07:11.747 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:11.747 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:11.747 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:11.747 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:11.747 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:11.747 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_51262 00:07:11.747 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:11.747 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_51262 00:07:11.747 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:07:11.747 associated memzone info: size: 1.007996 MiB name: MP_evtpool_51262 00:07:11.747 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:11.747 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:11.747 element at address: 
0x2000194bc800 with size: 1.008118 MiB 00:07:11.747 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:11.747 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:11.747 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:11.747 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:11.747 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:11.747 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:11.747 associated memzone info: size: 1.000366 MiB name: RG_ring_0_51262 00:07:11.747 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:11.747 associated memzone info: size: 1.000366 MiB name: RG_ring_1_51262 00:07:11.747 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:11.747 associated memzone info: size: 1.000366 MiB name: RG_ring_4_51262 00:07:11.747 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:11.747 associated memzone info: size: 1.000366 MiB name: RG_ring_5_51262 00:07:11.747 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:07:11.747 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_51262 00:07:11.747 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:07:11.747 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:11.747 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:11.747 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:11.747 element at address: 0x20001947c600 with size: 0.250488 MiB 00:07:11.747 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:11.747 element at address: 0x20000020a840 with size: 0.125488 MiB 00:07:11.747 associated memzone info: size: 0.125366 MiB name: RG_ring_2_51262 00:07:11.747 element at address: 0x2000070f5c40 with size: 0.031738 MiB 00:07:11.747 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:11.747 element at address: 0x200027e67540 with size: 0.023743 MiB 00:07:11.747 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:11.747 element at address: 0x200000206580 with size: 0.016113 MiB 00:07:11.748 associated memzone info: size: 0.015991 MiB name: RG_ring_3_51262 00:07:11.748 element at address: 0x200027e6d680 with size: 0.002441 MiB 00:07:11.748 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:11.748 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:07:11.748 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:11.748 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:11.748 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:11.748 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:11.748 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:11.748 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:11.748 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:11.748 element at address: 0x2000003c0280 with 
size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:11.748 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:11.748 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:11.748 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:11.748 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:11.748 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:11.748 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:11.748 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:11.748 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:11.748 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:11.748 element at address: 0x20000039b700 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:11.748 element at address: 0x200000397c40 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:11.748 element at address: 0x200000394180 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:11.748 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:11.748 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:11.748 element at address: 0x200000389140 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:11.748 element at address: 0x200000385680 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:11.748 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:07:11.748 element at address: 0x20000037e100 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:11.748 element at address: 0x20000037a640 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:11.748 element at address: 0x200000376b80 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:11.748 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:11.748 element at address: 0x20000036f600 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:11.748 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:07:11.748 associated memzone info: 
size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:11.748 element at address: 0x200000368080 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:11.748 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:11.748 element at address: 0x200000360b00 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:11.748 element at address: 0x20000035d580 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:11.748 element at address: 0x20000035a000 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:11.748 element at address: 0x200000356a80 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:11.748 element at address: 0x200000353500 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:11.748 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:11.748 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:11.748 element at address: 0x200000349480 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:11.748 element at address: 0x200000345f00 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:11.748 element at address: 0x200000342980 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:11.748 element at address: 0x20000033f400 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:11.748 element at address: 0x20000033be80 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:11.748 element at address: 0x200000338900 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:11.748 element at address: 0x200000335380 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:11.748 element at address: 0x200000331e00 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:11.748 element at address: 0x20000032e880 with size: 0.000427 MiB 00:07:11.748 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:11.748 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:07:11.748 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:11.748 element at address: 0x20000022b7c0 with size: 0.000305 MiB 00:07:11.748 associated memzone info: size: 0.000183 MiB name: MP_msgpool_51262 00:07:11.748 element at address: 0x200000206380 with size: 0.000305 MiB 00:07:11.748 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_51262 00:07:11.748 element at address: 0x200027e6e140 with size: 0.000305 MiB 00:07:11.748 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:11.748 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 
00:07:11.748 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:11.748 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:07:11.748 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:11.748 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:11.748 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:07:11.748 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:11.748 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:11.748 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:07:11.748 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:11.748 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:11.748 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:07:11.748 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:11.748 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:11.748 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:07:11.748 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:11.748 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:11.748 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:07:11.748 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:11.748 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:11.748 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:07:11.748 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:07:11.748 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:11.748 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:11.749 element at address: 0x2000003bc280 with size: 0.000244 MiB 00:07:11.749 associated 
memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:07:11.749 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:11.749 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:11.749 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:07:11.749 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:11.749 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:11.749 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:07:11.749 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:11.749 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:11.749 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:07:11.749 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:11.749 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:11.749 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:07:11.749 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:11.749 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:11.749 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:07:11.749 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:11.749 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:11.749 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:07:11.749 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:11.749 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:11.749 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:07:11.749 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:11.749 element at 
address: 0x20000039ef00 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:11.749 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:07:11.749 element at address: 0x20000039b600 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:11.749 element at address: 0x20000039b440 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:11.749 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:07:11.749 element at address: 0x200000397b40 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:11.749 element at address: 0x200000397980 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:11.749 element at address: 0x200000397700 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:07:11.749 element at address: 0x200000394080 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:11.749 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:11.749 element at address: 0x200000393c40 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:07:11.749 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:11.749 element at address: 0x200000390400 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:11.749 element at address: 0x200000390180 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:07:11.749 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:11.749 element at address: 0x20000038c940 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:11.749 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:07:11.749 element at address: 0x200000389040 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:11.749 element at address: 0x200000388e80 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:11.749 element at address: 0x200000388c00 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:07:11.749 element at address: 0x200000385580 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:11.749 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:11.749 element at address: 0x200000385140 with size: 0.000244 MiB 00:07:11.749 associated memzone info: 
size: 0.000122 MiB name: rte_compressdev_data_22 00:07:11.749 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:11.749 element at address: 0x200000381900 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:11.749 element at address: 0x200000381680 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:07:11.749 element at address: 0x20000037e000 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:11.749 element at address: 0x20000037de40 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:11.749 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:07:11.749 element at address: 0x20000037a540 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:11.749 element at address: 0x20000037a380 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:11.749 element at address: 0x20000037a100 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:07:11.749 element at address: 0x200000376a80 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:11.749 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:11.749 element at address: 0x200000376640 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:07:11.749 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:11.749 element at address: 0x200000372e00 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:11.749 element at address: 0x200000372b80 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:07:11.749 element at address: 0x20000036f500 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:11.749 element at address: 0x20000036f340 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:11.749 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:07:11.749 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:11.749 element at address: 0x20000036b880 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:11.749 element at address: 0x20000036b600 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:07:11.749 element at address: 0x200000367f80 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:11.749 element at address: 
0x200000367dc0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:11.749 element at address: 0x200000367b40 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:07:11.749 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:11.749 element at address: 0x200000364300 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:11.749 element at address: 0x200000364080 with size: 0.000244 MiB 00:07:11.749 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:07:11.749 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:07:11.749 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:11.749 09:11:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:11.749 09:11:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 51262 00:07:11.749 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 51262 ']' 00:07:11.750 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 51262 00:07:11.750 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:07:11.750 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:11.750 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 51262 00:07:11.750 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:11.750 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:11.750 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 51262' 00:07:11.750 killing process with pid 51262 00:07:11.750 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 51262 00:07:11.750 09:11:20 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 51262 00:07:12.378 00:07:12.378 real 0m1.767s 00:07:12.378 user 0m1.930s 00:07:12.378 sys 0m0.560s 00:07:12.378 09:11:21 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:12.378 09:11:21 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:12.378 ************************************ 00:07:12.378 END TEST dpdk_mem_utility 00:07:12.378 ************************************ 00:07:12.378 09:11:21 -- common/autotest_common.sh@1142 -- # return 0 00:07:12.378 09:11:21 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:12.378 09:11:21 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:12.378 09:11:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.378 09:11:21 -- common/autotest_common.sh@10 -- # set +x 00:07:12.378 ************************************ 00:07:12.378 START TEST event 00:07:12.378 ************************************ 00:07:12.378 09:11:21 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:12.378 * Looking for test storage... 
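The dpdk_mem_utility teardown a few lines above is the harness's killprocess helper running under xtrace: it validates the pid, confirms the process is alive, checks on Linux whether the target is an SPDK reactor rather than a sudo wrapper, and only then kills and reaps it. Reconstructed from the visible commands alone (the real function in test/common/autotest_common.sh carries more error handling), it behaves roughly like this:

    # Hedged sketch of killprocess, pieced together from the xtrace above only.
    killprocess() {
        local pid=$1
        [ -z "$pid" ] && return 1                  # refuse to run without a pid
        kill -0 "$pid"                             # is the process still alive?
        if [ "$(uname)" = Linux ]; then
            # the log shows this resolving to reactor_0 for the SPDK target
            process_name=$(ps --no-headers -o comm= "$pid")
        fi
        if [ "$process_name" = sudo ]; then
            :   # a sudo-wrapped target would need its child signalled; not exercised here
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                                # reap it and collect the exit status
    }

The same helper reappears later in the log when the scheduler test app (pid 52126) is torn down.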
00:07:12.378 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:07:12.379 09:11:21 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:12.379 09:11:21 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:12.379 09:11:21 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:12.379 09:11:21 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:12.379 09:11:21 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.379 09:11:21 event -- common/autotest_common.sh@10 -- # set +x 00:07:12.379 ************************************ 00:07:12.379 START TEST event_perf 00:07:12.379 ************************************ 00:07:12.379 09:11:21 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:12.379 Running I/O for 1 seconds...[2024-07-15 09:11:21.307782] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:12.379 [2024-07-15 09:11:21.307848] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid51503 ] 00:07:12.643 [2024-07-15 09:11:21.437180] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:12.643 [2024-07-15 09:11:21.537920] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:12.643 [2024-07-15 09:11:21.538005] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:12.643 [2024-07-15 09:11:21.538031] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:12.643 [2024-07-15 09:11:21.538035] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.021 Running I/O for 1 seconds... 00:07:14.021 lcore 0: 178423 00:07:14.021 lcore 1: 178421 00:07:14.021 lcore 2: 178423 00:07:14.021 lcore 3: 178424 00:07:14.021 done. 00:07:14.021 00:07:14.021 real 0m1.347s 00:07:14.021 user 0m4.197s 00:07:14.021 sys 0m0.143s 00:07:14.021 09:11:22 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.021 09:11:22 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:14.021 ************************************ 00:07:14.021 END TEST event_perf 00:07:14.021 ************************************ 00:07:14.021 09:11:22 event -- common/autotest_common.sh@1142 -- # return 0 00:07:14.021 09:11:22 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:14.021 09:11:22 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:14.021 09:11:22 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.021 09:11:22 event -- common/autotest_common.sh@10 -- # set +x 00:07:14.021 ************************************ 00:07:14.021 START TEST event_reactor 00:07:14.021 ************************************ 00:07:14.021 09:11:22 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:14.021 [2024-07-15 09:11:22.732843] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
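The event_perf run above exercises the SPDK event framework on four cores (-m 0xF) for one second (-t 1) and reports how many events each lcore processed. The invocation below simply repeats the command from the log; the tee/awk post-processing that totals the per-lcore counters is an illustrative addition, not part of the test.

    # Re-run the benchmark by hand from the SPDK tree; flags are the ones in the log.
    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./test/event/event_perf/event_perf -m 0xF -t 1 | tee perf.out
    # each "lcore N: <count>" line is the number of events that core handled in 1 s
    awk '/^lcore/ {total += $3} END {print "total events/sec:", total}' perf.out

With the counts printed above (about 178,000 per core) that works out to just over 713,000 events for the one-second run.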
00:07:14.021 [2024-07-15 09:11:22.732912] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid51704 ] 00:07:14.021 [2024-07-15 09:11:22.864187] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.021 [2024-07-15 09:11:22.962975] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.395 test_start 00:07:15.396 oneshot 00:07:15.396 tick 100 00:07:15.396 tick 100 00:07:15.396 tick 250 00:07:15.396 tick 100 00:07:15.396 tick 100 00:07:15.396 tick 250 00:07:15.396 tick 100 00:07:15.396 tick 500 00:07:15.396 tick 100 00:07:15.396 tick 100 00:07:15.396 tick 250 00:07:15.396 tick 100 00:07:15.396 tick 100 00:07:15.396 test_end 00:07:15.396 00:07:15.396 real 0m1.343s 00:07:15.396 user 0m1.202s 00:07:15.396 sys 0m0.134s 00:07:15.396 09:11:24 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:15.396 09:11:24 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:15.396 ************************************ 00:07:15.396 END TEST event_reactor 00:07:15.396 ************************************ 00:07:15.396 09:11:24 event -- common/autotest_common.sh@1142 -- # return 0 00:07:15.396 09:11:24 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:15.396 09:11:24 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:15.396 09:11:24 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:15.396 09:11:24 event -- common/autotest_common.sh@10 -- # set +x 00:07:15.396 ************************************ 00:07:15.396 START TEST event_reactor_perf 00:07:15.396 ************************************ 00:07:15.396 09:11:24 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:15.396 [2024-07-15 09:11:24.150545] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
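Every test in this section is launched through the same run_test wrapper, which is what produces the rows of asterisks, the START TEST/END TEST banners and the real/user/sys timings that separate event_perf, event_reactor and the remaining suites. Judging only from the xtrace lines in this log, the wrapper's shape is roughly the following; the real helper in common/autotest_common.sh also toggles xtrace and does argument checking.

    # Approximate shape of run_test as inferred from the banners and timings above.
    run_test() {
        local test_name=$1
        shift
        echo "************************************"
        echo "START TEST $test_name"
        echo "************************************"
        time "$@"            # the test command itself, e.g. reactor -t 1
        local rc=$?
        echo "************************************"
        echo "END TEST $test_name"
        echo "************************************"
        return $rc
    }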
00:07:15.396 [2024-07-15 09:11:24.150614] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid51902 ] 00:07:15.396 [2024-07-15 09:11:24.285122] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.653 [2024-07-15 09:11:24.391573] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.588 test_start 00:07:16.588 test_end 00:07:16.588 Performance: 328416 events per second 00:07:16.588 00:07:16.588 real 0m1.359s 00:07:16.588 user 0m1.213s 00:07:16.588 sys 0m0.139s 00:07:16.588 09:11:25 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.588 09:11:25 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:16.588 ************************************ 00:07:16.588 END TEST event_reactor_perf 00:07:16.588 ************************************ 00:07:16.588 09:11:25 event -- common/autotest_common.sh@1142 -- # return 0 00:07:16.588 09:11:25 event -- event/event.sh@49 -- # uname -s 00:07:16.588 09:11:25 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:16.588 09:11:25 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:16.588 09:11:25 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:16.588 09:11:25 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.588 09:11:25 event -- common/autotest_common.sh@10 -- # set +x 00:07:16.846 ************************************ 00:07:16.846 START TEST event_scheduler 00:07:16.847 ************************************ 00:07:16.847 09:11:25 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:16.847 * Looking for test storage... 00:07:16.847 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:07:16.847 09:11:25 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:16.847 09:11:25 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=52126 00:07:16.847 09:11:25 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:16.847 09:11:25 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:16.847 09:11:25 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 52126 00:07:16.847 09:11:25 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 52126 ']' 00:07:16.847 09:11:25 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:16.847 09:11:25 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:16.847 09:11:25 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:16.847 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
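For event_scheduler the harness switches from standalone binaries to a long-running app: it starts the scheduler test application with --wait-for-rpc, remembers its pid (52126), installs a killprocess trap, and then blocks in waitforlisten until the app answers on /var/tmp/spdk.sock. The loop below is a simplified stand-in for that helper, meant only to show the polling idea; the real implementation in autotest_common.sh is more thorough, and using rpc_get_methods as the liveness probe is an assumption on my part.

    # Simplified waitforlisten stand-in: poll the app's RPC socket until it answers.
    waitforlisten() {
        local pid=$1
        local rpc_addr=${2:-/var/tmp/spdk.sock}    # default socket, as in the trace
        local max_retries=100                      # also visible in the trace
        [ -z "$pid" ] && return 1
        echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
        for ((i = 0; i < max_retries; i++)); do
            kill -0 "$pid" 2> /dev/null || return 1            # give up if the app died
            if scripts/rpc.py -s "$rpc_addr" rpc_get_methods &> /dev/null; then
                return 0                                       # RPC server is up
            fi
            sleep 0.5
        done
        return 1
    }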
00:07:16.847 09:11:25 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:16.847 09:11:25 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:16.847 [2024-07-15 09:11:25.704040] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:16.847 [2024-07-15 09:11:25.704111] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid52126 ] 00:07:17.105 [2024-07-15 09:11:25.805489] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:17.105 [2024-07-15 09:11:25.889814] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.105 [2024-07-15 09:11:25.889889] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:17.105 [2024-07-15 09:11:25.889968] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:17.105 [2024-07-15 09:11:25.889970] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:07:18.041 09:11:26 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.041 [2024-07-15 09:11:26.656806] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:18.041 [2024-07-15 09:11:26.656829] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:07:18.041 [2024-07-15 09:11:26.656841] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:18.041 [2024-07-15 09:11:26.656849] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:18.041 [2024-07-15 09:11:26.656857] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.041 09:11:26 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.041 [2024-07-15 09:11:26.747749] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
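Once the app is listening, the test reconfigures it over RPC before any threads are created: framework_set_scheduler switches to the dynamic scheduler (the governor warning and the load/core/busy limits in the log come from that call), and framework_start_init releases the app from the --wait-for-rpc startup state. rpc_cmd is essentially the harness's wrapper around rpc.py, so the same two steps can be issued directly; the paths below assume the SPDK repository root as the working directory.

    # The two rpc_cmd calls above, expressed as plain rpc.py invocations.
    RPC_SOCK=/var/tmp/spdk.sock
    scripts/rpc.py -s "$RPC_SOCK" framework_set_scheduler dynamic   # pick the dynamic scheduler
    scripts/rpc.py -s "$RPC_SOCK" framework_start_init              # finish subsystem initialization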
00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.041 09:11:26 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.041 09:11:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.041 ************************************ 00:07:18.041 START TEST scheduler_create_thread 00:07:18.041 ************************************ 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.041 2 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.041 3 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.041 4 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.041 5 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.041 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.041 6 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.042 7 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.042 8 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.042 9 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.042 10 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.042 09:11:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.609 09:11:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.609 09:11:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:18.609 09:11:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.609 09:11:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.986 09:11:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.987 09:11:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:19.987 09:11:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:19.987 09:11:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.987 09:11:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.362 09:11:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.362 00:07:21.362 real 0m3.101s 00:07:21.362 user 0m0.019s 00:07:21.362 sys 0m0.012s 00:07:21.362 09:11:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:21.362 09:11:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.362 ************************************ 00:07:21.362 END TEST scheduler_create_thread 00:07:21.362 ************************************ 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:07:21.362 09:11:29 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:21.362 09:11:29 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 52126 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 52126 ']' 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 52126 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 52126 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 52126' 00:07:21.362 killing process with pid 52126 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 52126 00:07:21.362 09:11:29 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 52126 00:07:21.362 [2024-07-15 09:11:30.267018] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
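
The scheduler_create_thread test above reduces to a short RPC sequence against the scheduler test app. A minimal sketch of that sequence, assuming rpc_cmd is the autotest wrapper around scripts/rpc.py (as scheduler.sh uses it in the trace) and that the numeric thread ids (11 and 12 above) come back from the create calls:

# pinned threads (the trace repeats this for masks 0x1, 0x2, 0x4, 0x8):
# -m is a core mask; -a sets how busy the thread is (100 = fully active, 0 = idle),
# judging by the thread names in the trace
rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
# unpinned threads with partial activity
rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
thread_id=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)
rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active "$thread_id" 50
# create a thread only to delete it again
thread_id=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
rpc_cmd --plugin scheduler_plugin scheduler_thread_delete "$thread_id"
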
00:07:21.622 00:07:21.622 real 0m4.942s 00:07:21.622 user 0m9.764s 00:07:21.622 sys 0m0.481s 00:07:21.622 09:11:30 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:21.622 09:11:30 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:21.622 ************************************ 00:07:21.622 END TEST event_scheduler 00:07:21.622 ************************************ 00:07:21.622 09:11:30 event -- common/autotest_common.sh@1142 -- # return 0 00:07:21.622 09:11:30 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:21.622 09:11:30 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:21.622 09:11:30 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:21.622 09:11:30 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.622 09:11:30 event -- common/autotest_common.sh@10 -- # set +x 00:07:21.882 ************************************ 00:07:21.882 START TEST app_repeat 00:07:21.882 ************************************ 00:07:21.882 09:11:30 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@19 -- # repeat_pid=52877 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 52877' 00:07:21.882 Process app_repeat pid: 52877 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:21.882 spdk_app_start Round 0 00:07:21.882 09:11:30 event.app_repeat -- event/event.sh@25 -- # waitforlisten 52877 /var/tmp/spdk-nbd.sock 00:07:21.882 09:11:30 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 52877 ']' 00:07:21.882 09:11:30 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:21.882 09:11:30 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:21.882 09:11:30 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:21.882 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:21.882 09:11:30 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:21.882 09:11:30 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:21.882 [2024-07-15 09:11:30.629468] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
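
app_repeat exercises repeated start/stop of the same SPDK app over a private RPC socket; the setup traced above loads the nbd module, launches the app, and waits for its socket before each round. A sketch of that setup, assuming the binary is backgrounded with its pid captured via $! (the repeat_pid assignment in event.sh suggests exactly that) and with the long workspace path shortened to ./test/event/app_repeat:

rpc_server=/var/tmp/spdk-nbd.sock
modprobe nbd
# two cores (-m 0x3); the -t 4 repeat count matches repeat_times=4 in the trace
./test/event/app_repeat/app_repeat -r "$rpc_server" -m 0x3 -t 4 &
repeat_pid=$!
trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
echo "Process app_repeat pid: $repeat_pid"
for i in {0..2}; do
    echo "spdk_app_start Round $i"
    waitforlisten "$repeat_pid" "$rpc_server"   # block until the app answers on its socket
    # ...per-round malloc/nbd setup, data verify, spdk_kill_instance SIGTERM, sleep 3...
done
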
00:07:21.882 [2024-07-15 09:11:30.629537] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid52877 ] 00:07:21.882 [2024-07-15 09:11:30.762772] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:22.140 [2024-07-15 09:11:30.866657] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.140 [2024-07-15 09:11:30.866662] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.707 09:11:31 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:22.707 09:11:31 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:22.707 09:11:31 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:22.966 Malloc0 00:07:22.966 09:11:31 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.224 Malloc1 00:07:23.224 09:11:31 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.224 09:11:31 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.224 09:11:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.225 09:11:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:23.483 /dev/nbd0 00:07:23.483 09:11:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:23.483 09:11:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:23.483 1+0 records in 00:07:23.483 1+0 records out 00:07:23.483 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226314 s, 18.1 MB/s 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:23.483 09:11:32 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:23.483 09:11:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.483 09:11:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.483 09:11:32 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:23.741 /dev/nbd1 00:07:23.741 09:11:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:23.741 09:11:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:23.741 1+0 records in 00:07:23.741 1+0 records out 00:07:23.741 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269732 s, 15.2 MB/s 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:23.741 09:11:32 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:23.741 09:11:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.741 09:11:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.741 09:11:32 event.app_repeat -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:23.741 09:11:32 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.741 09:11:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:23.999 { 00:07:23.999 "nbd_device": "/dev/nbd0", 00:07:23.999 "bdev_name": "Malloc0" 00:07:23.999 }, 00:07:23.999 { 00:07:23.999 "nbd_device": "/dev/nbd1", 00:07:23.999 "bdev_name": "Malloc1" 00:07:23.999 } 00:07:23.999 ]' 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:23.999 { 00:07:23.999 "nbd_device": "/dev/nbd0", 00:07:23.999 "bdev_name": "Malloc0" 00:07:23.999 }, 00:07:23.999 { 00:07:23.999 "nbd_device": "/dev/nbd1", 00:07:23.999 "bdev_name": "Malloc1" 00:07:23.999 } 00:07:23.999 ]' 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:23.999 /dev/nbd1' 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:23.999 /dev/nbd1' 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:23.999 256+0 records in 00:07:23.999 256+0 records out 00:07:23.999 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00497391 s, 211 MB/s 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:23.999 256+0 records in 00:07:23.999 256+0 records out 00:07:23.999 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0180707 s, 58.0 MB/s 00:07:23.999 09:11:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:24.000 256+0 records in 00:07:24.000 256+0 records out 00:07:24.000 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199973 s, 52.4 MB/s 00:07:24.000 09:11:32 event.app_repeat -- 
bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.000 09:11:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.258 09:11:33 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:24.515 09:11:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:24.515 09:11:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:24.515 09:11:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:24.515 09:11:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.515 09:11:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.515 09:11:33 
event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:24.515 09:11:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.515 09:11:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.772 09:11:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.772 09:11:33 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.772 09:11:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:24.772 09:11:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:24.772 09:11:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:24.772 09:11:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:25.030 09:11:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:25.030 09:11:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:25.030 09:11:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:25.030 09:11:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:25.030 09:11:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:25.030 09:11:33 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:25.030 09:11:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:25.030 09:11:33 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:25.030 09:11:33 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:25.030 09:11:33 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:25.288 09:11:34 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:25.546 [2024-07-15 09:11:34.267547] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:25.546 [2024-07-15 09:11:34.365417] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.546 [2024-07-15 09:11:34.365421] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.546 [2024-07-15 09:11:34.417570] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:25.546 [2024-07-15 09:11:34.417623] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:28.823 09:11:37 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:28.823 09:11:37 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:28.823 spdk_app_start Round 1 00:07:28.823 09:11:37 event.app_repeat -- event/event.sh@25 -- # waitforlisten 52877 /var/tmp/spdk-nbd.sock 00:07:28.823 09:11:37 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 52877 ']' 00:07:28.823 09:11:37 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:28.823 09:11:37 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:28.823 09:11:37 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:28.823 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
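
Each app_repeat round traced here (Round 0 has just finished, Round 1 is starting) performs the same nbd round trip: create two malloc bdevs, export them as /dev/nbd0 and /dev/nbd1, push 1 MiB of random data through each block device, compare it back, then tear everything down and stop the app iteration. A condensed sketch of one round, with the workspace paths shortened and the waitfornbd/waitfornbd_exit readiness polling elided (see the sketch further below):

rpc="scripts/rpc.py -s /var/tmp/spdk-nbd.sock"   # run from the spdk checkout
# two malloc bdevs (arguments are size in MB and block size), exported over nbd
$rpc bdev_malloc_create 64 4096    # -> Malloc0
$rpc bdev_malloc_create 64 4096    # -> Malloc1
$rpc nbd_start_disk Malloc0 /dev/nbd0
$rpc nbd_start_disk Malloc1 /dev/nbd1
# write 1 MiB of random data through each exported device, then verify it byte for byte
dd if=/dev/urandom of=nbdrandtest bs=4096 count=256
for nbd in /dev/nbd0 /dev/nbd1; do
    dd if=nbdrandtest of="$nbd" bs=4096 count=256 oflag=direct
done
for nbd in /dev/nbd0 /dev/nbd1; do
    cmp -b -n 1M nbdrandtest "$nbd"
done
rm nbdrandtest
# teardown: unexport both bdevs and stop this app iteration
$rpc nbd_stop_disk /dev/nbd0
$rpc nbd_stop_disk /dev/nbd1
$rpc spdk_kill_instance SIGTERM
sleep 3
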
00:07:28.823 09:11:37 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:28.823 09:11:37 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:28.823 09:11:37 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:28.823 09:11:37 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:28.823 09:11:37 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:28.823 Malloc0 00:07:28.823 09:11:37 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:28.823 Malloc1 00:07:28.823 09:11:37 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:28.823 09:11:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:29.119 /dev/nbd0 00:07:29.119 09:11:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:29.119 09:11:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:29.119 1+0 records in 00:07:29.119 1+0 records out 00:07:29.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000163623 s, 25.0 MB/s 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:29.119 09:11:37 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:29.119 09:11:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.119 09:11:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.119 09:11:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:29.377 /dev/nbd1 00:07:29.377 09:11:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:29.377 09:11:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.377 1+0 records in 00:07:29.377 1+0 records out 00:07:29.377 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287242 s, 14.3 MB/s 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:29.377 09:11:38 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:29.378 09:11:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.378 09:11:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.378 09:11:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:29.378 09:11:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.378 09:11:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:07:29.636 { 00:07:29.636 "nbd_device": "/dev/nbd0", 00:07:29.636 "bdev_name": "Malloc0" 00:07:29.636 }, 00:07:29.636 { 00:07:29.636 "nbd_device": "/dev/nbd1", 00:07:29.636 "bdev_name": "Malloc1" 00:07:29.636 } 00:07:29.636 ]' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:29.636 { 00:07:29.636 "nbd_device": "/dev/nbd0", 00:07:29.636 "bdev_name": "Malloc0" 00:07:29.636 }, 00:07:29.636 { 00:07:29.636 "nbd_device": "/dev/nbd1", 00:07:29.636 "bdev_name": "Malloc1" 00:07:29.636 } 00:07:29.636 ]' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:29.636 /dev/nbd1' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:29.636 /dev/nbd1' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:29.636 256+0 records in 00:07:29.636 256+0 records out 00:07:29.636 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115189 s, 91.0 MB/s 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:29.636 256+0 records in 00:07:29.636 256+0 records out 00:07:29.636 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202593 s, 51.8 MB/s 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:29.636 256+0 records in 00:07:29.636 256+0 records out 00:07:29.636 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0203363 s, 51.6 MB/s 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.636 09:11:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:29.896 09:11:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:29.896 09:11:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:29.896 09:11:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:29.896 09:11:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.896 09:11:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.896 09:11:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:29.896 09:11:38 event.app_repeat -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:07:30.154 09:11:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:07:30.154 09:11:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.154 09:11:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:30.154 09:11:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:30.154 09:11:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.154 09:11:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.154 09:11:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 
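
The attach and detach points in the trace are guarded by waitfornbd and waitfornbd_exit, which poll /proc/partitions so the test never races the kernel. A sketch of that pattern based on the traced checks; the retry delay inside waitfornbd itself is not visible in the trace (it succeeded on the first try) and is an assumption here, as is the failure path:

waitfornbd() {
    local nbd_name=$1 i
    # wait for the kernel to list the new device (bounded at 20 tries in the trace)
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1
    done
    # then confirm a one-block direct read actually returns data
    for ((i = 1; i <= 20; i++)); do
        dd if=/dev/$nbd_name of=nbdtest bs=4096 count=1 iflag=direct
        size=$(stat -c %s nbdtest)
        rm -f nbdtest
        [ "$size" != 0 ] && return 0
        sleep 0.1
    done
    return 1
}

waitfornbd_exit() {
    local nbd_name=$1 i
    # poll until the device name drops out of /proc/partitions
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions || break
        sleep 0.1
    done
    return 0
}

waitfornbd runs right after each nbd_start_disk and waitfornbd_exit right after each nbd_stop_disk, which is why the grep -q -w nbdX /proc/partitions probes bracket every attach and detach in the trace.
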
00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.413 09:11:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:30.672 09:11:39 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:30.672 09:11:39 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:30.931 09:11:39 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:31.190 [2024-07-15 09:11:40.021758] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:31.190 [2024-07-15 09:11:40.131724] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:31.190 [2024-07-15 09:11:40.131730] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.449 [2024-07-15 09:11:40.181125] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:31.449 [2024-07-15 09:11:40.181178] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:34.004 09:11:42 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:34.004 09:11:42 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:34.004 spdk_app_start Round 2 00:07:34.004 09:11:42 event.app_repeat -- event/event.sh@25 -- # waitforlisten 52877 /var/tmp/spdk-nbd.sock 00:07:34.004 09:11:42 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 52877 ']' 00:07:34.004 09:11:42 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:34.004 09:11:42 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:34.004 09:11:42 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:34.004 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
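
Between those steps the test keeps re-counting the exported devices, which is where the count=2 and count=0 checks in the trace come from: nbd_get_disks returns a JSON array, jq pulls out the device paths, and grep -c counts them. A sketch of that check, with the rpc.py path shortened:

nbd_disks_json=$(scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks)
nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
# grep -c exits non-zero when nothing matches, hence the bare "true" seen in the trace
count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)
echo "$count"    # 2 while Malloc0/Malloc1 are exported, 0 after nbd_stop_disk
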
00:07:34.004 09:11:42 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:34.004 09:11:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:34.263 09:11:43 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:34.263 09:11:43 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:34.263 09:11:43 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:34.520 Malloc0 00:07:34.520 09:11:43 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:34.778 Malloc1 00:07:34.778 09:11:43 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.778 09:11:43 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:35.037 /dev/nbd0 00:07:35.037 09:11:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:35.037 09:11:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:35.037 1+0 records in 00:07:35.037 1+0 records out 00:07:35.037 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000214784 s, 19.1 MB/s 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:35.037 09:11:43 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:35.037 09:11:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.037 09:11:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:35.037 09:11:43 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:35.296 /dev/nbd1 00:07:35.296 09:11:44 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:35.296 09:11:44 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:35.296 1+0 records in 00:07:35.296 1+0 records out 00:07:35.296 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233567 s, 17.5 MB/s 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:35.296 09:11:44 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:35.296 09:11:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.296 09:11:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:35.296 09:11:44 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:35.296 09:11:44 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.296 09:11:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:07:35.556 { 00:07:35.556 "nbd_device": "/dev/nbd0", 00:07:35.556 "bdev_name": "Malloc0" 00:07:35.556 }, 00:07:35.556 { 00:07:35.556 "nbd_device": "/dev/nbd1", 00:07:35.556 "bdev_name": "Malloc1" 00:07:35.556 } 00:07:35.556 ]' 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:35.556 { 00:07:35.556 "nbd_device": "/dev/nbd0", 00:07:35.556 "bdev_name": "Malloc0" 00:07:35.556 }, 00:07:35.556 { 00:07:35.556 "nbd_device": "/dev/nbd1", 00:07:35.556 "bdev_name": "Malloc1" 00:07:35.556 } 00:07:35.556 ]' 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:35.556 /dev/nbd1' 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:35.556 /dev/nbd1' 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:35.556 09:11:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:35.557 256+0 records in 00:07:35.557 256+0 records out 00:07:35.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103997 s, 101 MB/s 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:35.557 256+0 records in 00:07:35.557 256+0 records out 00:07:35.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0187125 s, 56.0 MB/s 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:35.557 256+0 records in 00:07:35.557 256+0 records out 00:07:35.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0203775 s, 51.5 MB/s 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:35.557 09:11:44 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:35.815 09:11:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:35.815 09:11:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:35.815 09:11:44 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:35.815 09:11:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:35.815 09:11:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:35.815 09:11:44 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:35.816 09:11:44 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:35.816 09:11:44 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:35.816 09:11:44 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:35.816 09:11:44 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:36.382 09:11:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:36.663 09:11:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:36.663 09:11:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:36.663 09:11:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:36.663 09:11:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:36.663 09:11:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:36.663 09:11:45 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:36.663 09:11:45 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:36.663 09:11:45 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:36.663 09:11:45 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:36.663 09:11:45 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:36.921 09:11:45 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:36.921 [2024-07-15 09:11:45.861985] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:37.180 [2024-07-15 09:11:45.961042] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:37.180 [2024-07-15 09:11:45.961047] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.180 [2024-07-15 09:11:46.013360] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:37.180 [2024-07-15 09:11:46.013416] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:39.707 09:11:48 event.app_repeat -- event/event.sh@38 -- # waitforlisten 52877 /var/tmp/spdk-nbd.sock 00:07:39.707 09:11:48 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 52877 ']' 00:07:39.707 09:11:48 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:39.707 09:11:48 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:39.707 09:11:48 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:39.707 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
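
Both the scheduler app earlier (pid 52126) and this app_repeat instance (pid 52877) are stopped through the same killprocess helper, whose checks are visible in the trace: validate the pid, confirm the process is still alive and is not a sudo wrapper, then kill it and wait. A sketch of that happy path; the error branches are never taken in the trace and are assumptions here:

killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1             # the '[' -z ... ']' guard from the trace
    kill -0 "$pid" || return 1            # is the process still alive?
    if [ "$(uname)" = Linux ]; then
        process_name=$(ps --no-headers -o comm= "$pid")   # reactor_0 / reactor_2 above
    fi
    if [ "$process_name" = sudo ]; then
        # never taken in the traced runs; handling of a sudo wrapper is omitted here
        return 1
    fi
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"
}
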
00:07:39.707 09:11:48 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:39.707 09:11:48 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:39.964 09:11:48 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:39.964 09:11:48 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:39.964 09:11:48 event.app_repeat -- event/event.sh@39 -- # killprocess 52877 00:07:39.964 09:11:48 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 52877 ']' 00:07:39.964 09:11:48 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 52877 00:07:39.964 09:11:48 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:39.964 09:11:48 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:39.964 09:11:48 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 52877 00:07:39.964 09:11:48 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:39.964 09:11:48 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:39.965 09:11:48 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 52877' 00:07:39.965 killing process with pid 52877 00:07:40.222 09:11:48 event.app_repeat -- common/autotest_common.sh@967 -- # kill 52877 00:07:40.222 09:11:48 event.app_repeat -- common/autotest_common.sh@972 -- # wait 52877 00:07:40.222 spdk_app_start is called in Round 0. 00:07:40.222 Shutdown signal received, stop current app iteration 00:07:40.222 Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 reinitialization... 00:07:40.222 spdk_app_start is called in Round 1. 00:07:40.222 Shutdown signal received, stop current app iteration 00:07:40.222 Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 reinitialization... 00:07:40.222 spdk_app_start is called in Round 2. 00:07:40.222 Shutdown signal received, stop current app iteration 00:07:40.222 Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 reinitialization... 00:07:40.222 spdk_app_start is called in Round 3. 
00:07:40.222 Shutdown signal received, stop current app iteration 00:07:40.222 09:11:49 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:40.222 09:11:49 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:40.222 00:07:40.222 real 0m18.534s 00:07:40.222 user 0m39.834s 00:07:40.222 sys 0m3.887s 00:07:40.222 09:11:49 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:40.222 09:11:49 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:40.222 ************************************ 00:07:40.222 END TEST app_repeat 00:07:40.222 ************************************ 00:07:40.222 09:11:49 event -- common/autotest_common.sh@1142 -- # return 0 00:07:40.222 09:11:49 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:40.222 00:07:40.222 real 0m28.012s 00:07:40.222 user 0m56.388s 00:07:40.222 sys 0m5.131s 00:07:40.222 09:11:49 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:40.222 09:11:49 event -- common/autotest_common.sh@10 -- # set +x 00:07:40.222 ************************************ 00:07:40.222 END TEST event 00:07:40.222 ************************************ 00:07:40.480 09:11:49 -- common/autotest_common.sh@1142 -- # return 0 00:07:40.480 09:11:49 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:40.480 09:11:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:40.480 09:11:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.480 09:11:49 -- common/autotest_common.sh@10 -- # set +x 00:07:40.480 ************************************ 00:07:40.480 START TEST thread 00:07:40.480 ************************************ 00:07:40.480 09:11:49 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:40.480 * Looking for test storage... 00:07:40.480 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:40.480 09:11:49 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:40.480 09:11:49 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:40.480 09:11:49 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.480 09:11:49 thread -- common/autotest_common.sh@10 -- # set +x 00:07:40.480 ************************************ 00:07:40.480 START TEST thread_poller_perf 00:07:40.480 ************************************ 00:07:40.480 09:11:49 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:40.480 [2024-07-15 09:11:49.404737] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:40.480 [2024-07-15 09:11:49.404789] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid55586 ] 00:07:40.737 [2024-07-15 09:11:49.516585] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.737 [2024-07-15 09:11:49.617971] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.737 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:42.109 ====================================== 00:07:42.109 busy:2307094776 (cyc) 00:07:42.109 total_run_count: 266000 00:07:42.109 tsc_hz: 2300000000 (cyc) 00:07:42.109 ====================================== 00:07:42.109 poller_cost: 8673 (cyc), 3770 (nsec) 00:07:42.109 00:07:42.109 real 0m1.329s 00:07:42.109 user 0m1.203s 00:07:42.109 sys 0m0.120s 00:07:42.109 09:11:50 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:42.109 09:11:50 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:42.109 ************************************ 00:07:42.109 END TEST thread_poller_perf 00:07:42.109 ************************************ 00:07:42.109 09:11:50 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:42.109 09:11:50 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:42.109 09:11:50 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:42.109 09:11:50 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:42.109 09:11:50 thread -- common/autotest_common.sh@10 -- # set +x 00:07:42.109 ************************************ 00:07:42.109 START TEST thread_poller_perf 00:07:42.109 ************************************ 00:07:42.109 09:11:50 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:42.109 [2024-07-15 09:11:50.831051] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:42.109 [2024-07-15 09:11:50.831115] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid55779 ] 00:07:42.109 [2024-07-15 09:11:50.948903] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.109 [2024-07-15 09:11:51.048908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.110 Running 1000 pollers for 1 seconds with 0 microseconds period. 
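As a cross-check of the first poller_perf summary above: poller_cost is the busy cycle count divided by total_run_count, and the nanosecond figure is that result converted at tsc_hz. Redoing the arithmetic for the 1-microsecond-period run with the printed values:

    awk 'BEGIN {
        busy = 2307094776; runs = 266000; tsc_hz = 2300000000
        cyc = busy / runs
        printf "poller_cost: %d (cyc), %d (nsec)\n", cyc, cyc * 1e9 / tsc_hz
    }'
    # prints: poller_cost: 8673 (cyc), 3770 (nsec), matching the summary line above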
00:07:43.483 ====================================== 00:07:43.483 busy:2302781308 (cyc) 00:07:43.483 total_run_count: 3498000 00:07:43.483 tsc_hz: 2300000000 (cyc) 00:07:43.483 ====================================== 00:07:43.483 poller_cost: 658 (cyc), 286 (nsec) 00:07:43.483 00:07:43.483 real 0m1.339s 00:07:43.483 user 0m1.202s 00:07:43.483 sys 0m0.132s 00:07:43.483 09:11:52 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:43.483 09:11:52 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:43.483 ************************************ 00:07:43.483 END TEST thread_poller_perf 00:07:43.483 ************************************ 00:07:43.483 09:11:52 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:43.483 09:11:52 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:07:43.483 00:07:43.483 real 0m2.939s 00:07:43.483 user 0m2.507s 00:07:43.483 sys 0m0.441s 00:07:43.483 09:11:52 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:43.483 09:11:52 thread -- common/autotest_common.sh@10 -- # set +x 00:07:43.483 ************************************ 00:07:43.483 END TEST thread 00:07:43.483 ************************************ 00:07:43.483 09:11:52 -- common/autotest_common.sh@1142 -- # return 0 00:07:43.483 09:11:52 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:43.483 09:11:52 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:43.483 09:11:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:43.483 09:11:52 -- common/autotest_common.sh@10 -- # set +x 00:07:43.483 ************************************ 00:07:43.483 START TEST accel 00:07:43.483 ************************************ 00:07:43.483 09:11:52 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:43.483 * Looking for test storage... 00:07:43.483 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:43.483 09:11:52 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:43.483 09:11:52 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:43.483 09:11:52 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:43.483 09:11:52 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=56021 00:07:43.483 09:11:52 accel -- accel/accel.sh@63 -- # waitforlisten 56021 00:07:43.483 09:11:52 accel -- common/autotest_common.sh@829 -- # '[' -z 56021 ']' 00:07:43.483 09:11:52 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:43.483 09:11:52 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:43.483 09:11:52 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:43.483 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
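The zero-microsecond-period summary above can be checked the same way, and the comparison with the previous run shows what removing the 1 us sleep between polls buys: roughly 13x more poller runs on the same core and a far smaller per-poll cost. Using the printed counters:

    awk 'BEGIN {
        cyc = 2302781308 / 3498000
        printf "%d cyc, %d nsec per poll, %.1fx the run count of the 1 us case\n", cyc, cyc / 2.3, 3498000 / 266000
    }'
    # prints roughly: 658 cyc, 286 nsec per poll, 13.2x the run count (2.3 = tsc_hz in GHz)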
00:07:43.483 09:11:52 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:43.483 09:11:52 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:43.483 09:11:52 accel -- common/autotest_common.sh@10 -- # set +x 00:07:43.483 09:11:52 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:43.484 09:11:52 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:43.484 09:11:52 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:43.484 09:11:52 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:43.484 09:11:52 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:43.484 09:11:52 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:43.484 09:11:52 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:43.484 09:11:52 accel -- accel/accel.sh@41 -- # jq -r . 00:07:43.484 [2024-07-15 09:11:52.434491] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:43.484 [2024-07-15 09:11:52.434565] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid56021 ] 00:07:43.743 [2024-07-15 09:11:52.562247] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.743 [2024-07-15 09:11:52.661625] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.311 09:11:53 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:44.311 09:11:53 accel -- common/autotest_common.sh@862 -- # return 0 00:07:44.311 09:11:53 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:44.311 09:11:53 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:44.311 09:11:53 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:44.311 09:11:53 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:44.311 09:11:53 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:44.311 09:11:53 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:44.311 09:11:53 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:44.311 09:11:53 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.311 09:11:53 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:44.311 09:11:53 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in 
"${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.311 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.311 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.311 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.569 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.569 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.569 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.569 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.569 09:11:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:44.569 09:11:53 accel -- accel/accel.sh@72 -- # IFS== 00:07:44.569 09:11:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:44.569 09:11:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:44.569 09:11:53 accel -- accel/accel.sh@75 -- # killprocess 56021 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@948 -- # '[' -z 56021 ']' 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@952 -- # kill -0 56021 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@953 -- # uname 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 56021 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 56021' 00:07:44.569 killing process with pid 56021 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@967 -- # kill 56021 00:07:44.569 09:11:53 accel -- common/autotest_common.sh@972 -- # wait 56021 00:07:44.827 09:11:53 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:44.827 09:11:53 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:44.827 09:11:53 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:44.827 09:11:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.827 09:11:53 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.827 09:11:53 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:07:44.827 09:11:53 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:44.827 09:11:53 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:44.827 09:11:53 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.827 09:11:53 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.827 09:11:53 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.827 09:11:53 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.827 09:11:53 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.827 09:11:53 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:07:44.827 09:11:53 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:07:44.827 09:11:53 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.827 09:11:53 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:45.086 09:11:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:45.086 09:11:53 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:45.086 09:11:53 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:45.086 09:11:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:45.086 09:11:53 accel -- common/autotest_common.sh@10 -- # set +x 00:07:45.086 ************************************ 00:07:45.086 START TEST accel_missing_filename 00:07:45.086 ************************************ 00:07:45.086 09:11:53 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:07:45.086 09:11:53 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:45.086 09:11:53 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:45.086 09:11:53 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:45.086 09:11:53 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:45.086 09:11:53 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:45.086 09:11:53 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:45.086 09:11:53 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:45.086 09:11:53 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:45.086 09:11:53 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:45.086 09:11:53 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:45.086 09:11:53 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:45.086 09:11:53 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.086 09:11:53 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.086 09:11:53 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:45.086 09:11:53 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:45.086 09:11:53 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:45.086 [2024-07-15 09:11:53.894671] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:45.086 [2024-07-15 09:11:53.894737] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid56235 ] 00:07:45.087 [2024-07-15 09:11:54.028510] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.345 [2024-07-15 09:11:54.135529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.345 [2024-07-15 09:11:54.207513] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:45.345 [2024-07-15 09:11:54.280227] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:45.646 A filename is required. 
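For reference, the long get_expected_opcs loop traced earlier (before the accel_help run) boils down to asking the freshly started target which module is assigned to each accel opcode and recording the answers; with no opcode overrides configured, every entry resolves to the software module, which is why the trace repeats expected_opcs["$opc"]=software. A condensed sketch of that query, using the rpc.py path from this run and the target's default /var/tmp/spdk.sock socket:

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    declare -A expected_opcs

    # accel_get_opc_assignments returns a JSON object of opcode -> module pairs
    exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'))

    for opc_opt in "${exp_opcs[@]}"; do
        IFS== read -r opc module <<< "$opc_opt"   # split entries of the form "copy=software"
        expected_opcs["$opc"]=$module
    done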
00:07:45.646 09:11:54 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:45.646 09:11:54 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:45.646 09:11:54 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:45.646 09:11:54 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:45.646 09:11:54 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:45.646 09:11:54 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:45.646 00:07:45.646 real 0m0.527s 00:07:45.646 user 0m0.340s 00:07:45.646 sys 0m0.211s 00:07:45.646 09:11:54 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:45.646 09:11:54 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:45.646 ************************************ 00:07:45.646 END TEST accel_missing_filename 00:07:45.646 ************************************ 00:07:45.646 09:11:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:45.646 09:11:54 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:45.646 09:11:54 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:45.646 09:11:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:45.646 09:11:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:45.646 ************************************ 00:07:45.646 START TEST accel_compress_verify 00:07:45.646 ************************************ 00:07:45.646 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:45.646 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:45.646 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:45.646 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:45.646 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:45.646 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:45.646 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:45.646 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:45.646 09:11:54 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:45.646 09:11:54 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:45.646 09:11:54 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:45.646 09:11:54 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:45.646 09:11:54 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.646 09:11:54 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.646 09:11:54 
accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:45.646 09:11:54 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:45.646 09:11:54 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:45.646 [2024-07-15 09:11:54.502394] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:45.646 [2024-07-15 09:11:54.502456] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid56425 ] 00:07:45.905 [2024-07-15 09:11:54.620457] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.905 [2024-07-15 09:11:54.723336] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.905 [2024-07-15 09:11:54.792856] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:46.164 [2024-07-15 09:11:54.867838] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:46.164 00:07:46.164 Compression does not support the verify option, aborting. 00:07:46.164 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:46.164 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:46.164 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:46.165 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:46.165 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:46.165 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:46.165 00:07:46.165 real 0m0.499s 00:07:46.165 user 0m0.339s 00:07:46.165 sys 0m0.191s 00:07:46.165 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.165 09:11:54 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:46.165 ************************************ 00:07:46.165 END TEST accel_compress_verify 00:07:46.165 ************************************ 00:07:46.165 09:11:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:46.165 09:11:55 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:46.165 09:11:55 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:46.165 09:11:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.165 09:11:55 accel -- common/autotest_common.sh@10 -- # set +x 00:07:46.165 ************************************ 00:07:46.165 START TEST accel_wrong_workload 00:07:46.165 ************************************ 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
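Both negative tests above lean on the NOT wrapper from autotest_common.sh, whose effect shows up in the es=234 -> 106 -> 1 and es=161 -> 33 -> 1 sequences: the wrapped command's exit status is captured, the 128 bit is stripped from signal-style exit codes, anything nonzero collapses to 1, and the wrapper itself succeeds only when the command genuinely failed. A reduced sketch of that shape (the real helper has extra branches for specific signals and an optional expected exit status, omitted here):

    NOT() {
        local es=0
        "$@" || es=$?                          # run the command that is expected to fail
        (( es > 128 )) && es=$((es & ~128))    # strip the high bit, e.g. 234 -> 106
        case "$es" in
            0) ;;                              # unexpected success, leave es=0
            *) es=1 ;;                         # any genuine failure collapses to 1
        esac
        (( !es == 0 ))                         # exit 0 (success) only if the command failed
    }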
00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:46.165 09:11:55 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:46.165 09:11:55 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:46.165 09:11:55 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:46.165 09:11:55 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:46.165 09:11:55 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.165 09:11:55 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.165 09:11:55 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:46.165 09:11:55 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:07:46.165 09:11:55 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:07:46.165 Unsupported workload type: foobar 00:07:46.165 [2024-07-15 09:11:55.078566] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:46.165 accel_perf options: 00:07:46.165 [-h help message] 00:07:46.165 [-q queue depth per core] 00:07:46.165 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:46.165 [-T number of threads per core 00:07:46.165 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:46.165 [-t time in seconds] 00:07:46.165 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:46.165 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:46.165 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:46.165 [-l for compress/decompress workloads, name of uncompressed input file 00:07:46.165 [-S for crc32c workload, use this seed value (default 0) 00:07:46.165 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:46.165 [-f for fill workload, use this BYTE value (default 255) 00:07:46.165 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:46.165 [-y verify result if this switch is on] 00:07:46.165 [-a tasks to allocate per core (default: same value as -q)] 00:07:46.165 Can be used to spread operations across a wider range of memory. 
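The options dump above is accel_perf rejecting '-w foobar'; the accepted workload names are exactly the ones listed on the -w line, and the next negative test pokes at -x the same way (the help text notes xor needs at least 2 source buffers). For contrast, well-formed invocations of the flags these negative tests exercise, modelled on the passing tests elsewhere in this log; the harness additionally feeds a JSON accel config via -c /dev/fd/62, omitted here, and exact behaviour depends on how the accel modules are configured:

    perf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf
    # valid workload name, crc32c seed 32, verify the results
    $perf -t 1 -w crc32c -S 32 -y
    # compress needs an input file via -l, and no -y, which the compress_verify test above showed is rejected
    $perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
    # xor with the documented minimum of two source buffers, unlike the -x -1 case below
    $perf -t 1 -w xor -y -x 2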
00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:46.165 00:07:46.165 real 0m0.043s 00:07:46.165 user 0m0.026s 00:07:46.165 sys 0m0.017s 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.165 09:11:55 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:46.165 ************************************ 00:07:46.165 END TEST accel_wrong_workload 00:07:46.165 ************************************ 00:07:46.165 Error: writing output failed: Broken pipe 00:07:46.424 09:11:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:46.424 09:11:55 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:46.424 09:11:55 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:46.424 09:11:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.424 09:11:55 accel -- common/autotest_common.sh@10 -- # set +x 00:07:46.424 ************************************ 00:07:46.424 START TEST accel_negative_buffers 00:07:46.424 ************************************ 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:46.424 09:11:55 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:46.424 09:11:55 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:46.424 09:11:55 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:46.424 09:11:55 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:46.424 09:11:55 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.424 09:11:55 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.424 09:11:55 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:46.424 09:11:55 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:07:46.424 09:11:55 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:07:46.424 -x option must be non-negative. 
00:07:46.424 [2024-07-15 09:11:55.205124] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:46.424 accel_perf options: 00:07:46.424 [-h help message] 00:07:46.424 [-q queue depth per core] 00:07:46.424 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:46.424 [-T number of threads per core 00:07:46.424 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:46.424 [-t time in seconds] 00:07:46.424 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:46.424 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:46.424 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:46.424 [-l for compress/decompress workloads, name of uncompressed input file 00:07:46.424 [-S for crc32c workload, use this seed value (default 0) 00:07:46.424 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:46.424 [-f for fill workload, use this BYTE value (default 255) 00:07:46.424 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:46.424 [-y verify result if this switch is on] 00:07:46.424 [-a tasks to allocate per core (default: same value as -q)] 00:07:46.424 Can be used to spread operations across a wider range of memory. 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:46.424 00:07:46.424 real 0m0.045s 00:07:46.424 user 0m0.025s 00:07:46.424 sys 0m0.020s 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.424 09:11:55 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:46.424 ************************************ 00:07:46.424 END TEST accel_negative_buffers 00:07:46.424 ************************************ 00:07:46.424 Error: writing output failed: Broken pipe 00:07:46.424 09:11:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:46.424 09:11:55 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:46.424 09:11:55 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:46.424 09:11:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.424 09:11:55 accel -- common/autotest_common.sh@10 -- # set +x 00:07:46.424 ************************************ 00:07:46.424 START TEST accel_crc32c 00:07:46.424 ************************************ 00:07:46.424 09:11:55 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:46.424 09:11:55 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:46.424 [2024-07-15 09:11:55.324951] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:46.424 [2024-07-15 09:11:55.325015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid56498 ] 00:07:46.683 [2024-07-15 09:11:55.454804] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.683 [2024-07-15 09:11:55.555408] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.683 09:11:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.059 09:11:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.059 09:11:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.059 09:11:56 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:48.060 09:11:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:48.060 00:07:48.060 real 0m1.494s 00:07:48.060 user 0m0.011s 00:07:48.060 sys 0m0.003s 00:07:48.060 09:11:56 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:48.060 09:11:56 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:48.060 ************************************ 00:07:48.060 END TEST accel_crc32c 00:07:48.060 ************************************ 00:07:48.060 09:11:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:48.060 09:11:56 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:48.060 09:11:56 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:48.060 09:11:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:48.060 09:11:56 accel -- common/autotest_common.sh@10 -- # set +x 00:07:48.060 ************************************ 00:07:48.060 START TEST accel_crc32c_C2 00:07:48.060 ************************************ 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:48.060 09:11:56 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:48.060 [2024-07-15 09:11:56.879786] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:48.060 [2024-07-15 09:11:56.879847] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid56727 ] 00:07:48.060 [2024-07-15 09:11:57.007265] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.319 [2024-07-15 09:11:57.109296] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:48.319 09:11:57 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.319 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.320 09:11:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 
-- # read -r var val 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:49.696 00:07:49.696 real 0m1.503s 00:07:49.696 user 0m0.010s 00:07:49.696 sys 0m0.002s 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:49.696 09:11:58 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:49.696 ************************************ 00:07:49.696 END TEST accel_crc32c_C2 00:07:49.696 ************************************ 00:07:49.696 09:11:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:49.696 09:11:58 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:49.696 09:11:58 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:49.696 09:11:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:49.696 09:11:58 accel -- common/autotest_common.sh@10 -- # set +x 00:07:49.696 ************************************ 00:07:49.696 START TEST accel_copy 00:07:49.696 ************************************ 00:07:49.696 09:11:58 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.696 
09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:49.696 09:11:58 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:49.696 [2024-07-15 09:11:58.457601] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:49.696 [2024-07-15 09:11:58.457664] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid57036 ] 00:07:49.696 [2024-07-15 09:11:58.586856] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.955 [2024-07-15 09:11:58.687992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.955 09:11:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.332 09:11:59 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:51.332 09:11:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:51.332 00:07:51.332 real 0m1.506s 00:07:51.332 user 0m0.009s 00:07:51.332 sys 0m0.002s 00:07:51.332 09:11:59 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:51.332 09:11:59 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:51.332 ************************************ 00:07:51.332 END TEST accel_copy 00:07:51.332 ************************************ 00:07:51.332 09:11:59 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:51.332 09:11:59 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:51.332 09:11:59 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:51.332 09:11:59 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:51.332 09:11:59 accel -- common/autotest_common.sh@10 -- # set +x 00:07:51.332 ************************************ 00:07:51.332 START TEST accel_fill 00:07:51.332 ************************************ 00:07:51.332 09:12:00 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 
]] 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:51.332 09:12:00 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:07:51.332 [2024-07-15 09:12:00.037100] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:51.332 [2024-07-15 09:12:00.037166] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid57244 ] 00:07:51.332 [2024-07-15 09:12:00.164793] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.332 [2024-07-15 09:12:00.269143] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 
09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.591 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:51.592 09:12:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:52.967 09:12:01 accel.accel_fill -- accel/accel.sh@21 
-- # case "$var" in 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:52.968 09:12:01 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:52.968 00:07:52.968 real 0m1.500s 00:07:52.968 user 0m0.011s 00:07:52.968 sys 0m0.001s 00:07:52.968 09:12:01 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:52.968 09:12:01 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:52.968 ************************************ 00:07:52.968 END TEST accel_fill 00:07:52.968 ************************************ 00:07:52.968 09:12:01 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:52.968 09:12:01 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:52.968 09:12:01 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:52.968 09:12:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:52.968 09:12:01 accel -- common/autotest_common.sh@10 -- # set +x 00:07:52.968 ************************************ 00:07:52.968 START TEST accel_copy_crc32c 00:07:52.968 ************************************ 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:52.968 [2024-07-15 09:12:01.615516] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:52.968 [2024-07-15 09:12:01.615583] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid57548 ] 00:07:52.968 [2024-07-15 09:12:01.747177] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.968 [2024-07-15 09:12:01.855502] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:52.968 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:53.226 09:12:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.158 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 
00:07:54.158 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.158 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:54.159 00:07:54.159 real 0m1.505s 00:07:54.159 user 0m0.012s 00:07:54.159 sys 0m0.002s 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:54.159 09:12:03 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:54.159 ************************************ 00:07:54.159 END TEST accel_copy_crc32c 00:07:54.159 ************************************ 00:07:54.416 09:12:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:54.416 09:12:03 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:54.416 09:12:03 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:54.416 09:12:03 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.416 09:12:03 accel -- common/autotest_common.sh@10 -- # set +x 00:07:54.416 ************************************ 00:07:54.416 START TEST accel_copy_crc32c_C2 00:07:54.416 ************************************ 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:54.416 09:12:03 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:54.416 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:54.416 [2024-07-15 09:12:03.190036] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:07:54.416 [2024-07-15 09:12:03.190097] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid57768 ] 00:07:54.416 [2024-07-15 09:12:03.318697] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.673 [2024-07-15 09:12:03.420282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.673 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # 
IFS=: 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:54.674 09:12:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:56.046 00:07:56.046 real 0m1.504s 00:07:56.046 user 0m0.012s 
00:07:56.046 sys 0m0.001s 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:56.046 09:12:04 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:56.046 ************************************ 00:07:56.046 END TEST accel_copy_crc32c_C2 00:07:56.046 ************************************ 00:07:56.046 09:12:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:56.046 09:12:04 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:56.046 09:12:04 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:56.046 09:12:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:56.046 09:12:04 accel -- common/autotest_common.sh@10 -- # set +x 00:07:56.046 ************************************ 00:07:56.046 START TEST accel_dualcast 00:07:56.046 ************************************ 00:07:56.046 09:12:04 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:56.046 09:12:04 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:56.046 [2024-07-15 09:12:04.758439] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
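Note on the trace above: the accel_dualcast case, like the earlier workloads, shells out to the accel_perf example binary and then parses its key:value output (the repeated "IFS=:", "read -r var val" and "case \"$var\" in" lines). A minimal standalone sketch of that idiom follows; the binary path and flags are copied from the trace, while the case patterns and the omission of the JSON config the harness passes over /dev/fd/62 are assumptions, not the exact accel.sh logic.

    # Sketch only -- loosely mirrors the accel.sh@12-@27 steps traced above.
    PERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf
    accel_module="" accel_opc=""
    while IFS=: read -r var val; do          # split "Key: value" output lines
      case "$var" in                         # key patterns here are illustrative
        *[Mm]odule*)   accel_module=${val//[[:space:]]/} ;;   # e.g. software
        *[Ww]orkload*) accel_opc=${val//[[:space:]]/} ;;      # e.g. dualcast
      esac
    done < <("$PERF" -t 1 -w dualcast -y)    # harness additionally passes -c /dev/fd/62
    # corresponds to the [[ -n software ]] / [[ software == software ]] checks at accel.sh@27
    [[ -n $accel_module && -n $accel_opc && $accel_module == software ]]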
00:07:56.046 [2024-07-15 09:12:04.758499] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid58090 ] 00:07:56.046 [2024-07-15 09:12:04.885033] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.046 [2024-07-15 09:12:04.992199] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- 
accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:56.304 09:12:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:57.679 09:12:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:57.679 00:07:57.679 real 0m1.498s 00:07:57.679 user 0m0.011s 00:07:57.679 sys 0m0.001s 00:07:57.679 09:12:06 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:57.679 09:12:06 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:57.679 ************************************ 00:07:57.679 END TEST accel_dualcast 00:07:57.679 ************************************ 00:07:57.679 09:12:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:57.679 09:12:06 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:57.679 09:12:06 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:57.679 09:12:06 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.679 09:12:06 accel -- common/autotest_common.sh@10 -- # set +x 00:07:57.679 ************************************ 00:07:57.679 START TEST accel_compare 00:07:57.679 ************************************ 00:07:57.679 09:12:06 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:57.679 [2024-07-15 09:12:06.339179] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
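For orientation, the accel_compare run beginning here is the last of the software-module workloads exercised in this stretch of the log. The underlying accel_perf invocations, with flags exactly as recorded in the trace, are consolidated below; the harness also supplies a JSON config via -c /dev/fd/62, so running them bare like this is an approximation that assumes the default software module is used.

    # Consolidated from the run_test lines at accel/accel.sh@103-@108 in this log.
    PERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf
    "$PERF" -t 1 -w copy -y                        # accel_copy
    "$PERF" -t 1 -w fill -f 128 -q 64 -a 64 -y     # accel_fill
    "$PERF" -t 1 -w copy_crc32c -y                 # accel_copy_crc32c
    "$PERF" -t 1 -w copy_crc32c -y -C 2            # accel_copy_crc32c_C2
    "$PERF" -t 1 -w dualcast -y                    # accel_dualcast
    "$PERF" -t 1 -w compare -y                     # accel_compare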
00:07:57.679 [2024-07-15 09:12:06.339239] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid58594 ] 00:07:57.679 [2024-07-15 09:12:06.468216] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.679 [2024-07-15 09:12:06.567278] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.679 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.680 
09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:57.680 09:12:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.050 09:12:07 
accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:59.050 09:12:07 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:59.050 00:07:59.050 real 0m1.471s 00:07:59.050 user 0m0.008s 00:07:59.050 sys 0m0.005s 00:07:59.050 09:12:07 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:59.050 09:12:07 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:59.050 ************************************ 00:07:59.050 END TEST accel_compare 00:07:59.050 ************************************ 00:07:59.050 09:12:07 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:59.050 09:12:07 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:59.050 09:12:07 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:59.050 09:12:07 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:59.050 09:12:07 accel -- common/autotest_common.sh@10 -- # set +x 00:07:59.050 ************************************ 00:07:59.050 START TEST accel_xor 00:07:59.050 ************************************ 00:07:59.051 09:12:07 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:59.051 09:12:07 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:59.051 [2024-07-15 09:12:07.890519] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
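The accel_xor test beginning here uses the same accel_perf binary with the xor workload (-t 1 -w xor -y in the logged command line), and the configuration reads that follow show 4096-byte buffers being XORed on the software module for one second. Under the same assumptions as the previous sketch, a standalone re-run would be roughly:

    ./build/examples/accel_perf -t 1 -w xor -y   # flags copied from the logged invocation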
00:07:59.051 [2024-07-15 09:12:07.890580] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid58898 ] 00:07:59.308 [2024-07-15 09:12:08.020178] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.308 [2024-07-15 09:12:08.120262] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.308 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:59.309 09:12:08 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:59.309 09:12:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:00.713 00:08:00.713 real 0m1.505s 00:08:00.713 user 0m0.008s 00:08:00.713 sys 0m0.004s 00:08:00.713 09:12:09 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.713 09:12:09 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:00.713 ************************************ 00:08:00.713 END TEST accel_xor 00:08:00.713 ************************************ 00:08:00.713 09:12:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:00.713 09:12:09 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:08:00.713 09:12:09 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:00.713 09:12:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:00.713 09:12:09 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.713 ************************************ 00:08:00.713 START TEST accel_xor 00:08:00.713 ************************************ 00:08:00.713 09:12:09 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:08:00.713 09:12:09 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:00.714 09:12:09 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:08:00.714 09:12:09 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.714 09:12:09 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.714 09:12:09 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.714 09:12:09 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.714 09:12:09 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:00.714 09:12:09 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:00.714 09:12:09 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:00.714 [2024-07-15 09:12:09.458034] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
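This second accel_xor test repeats the xor workload with an extra -x 3 argument; judging from the configuration reads (val=3 here versus val=2 in the run above), -x appears to raise the number of xor source buffers from two to three, though that interpretation is an assumption. A corresponding hand-run sketch:

    ./build/examples/accel_perf -t 1 -w xor -y -x 3   # -x assumed to set the xor source count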
00:08:00.714 [2024-07-15 09:12:09.458094] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid59103 ] 00:08:00.714 [2024-07-15 09:12:09.585258] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.972 [2024-07-15 09:12:09.687547] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:00.972 09:12:09 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.972 09:12:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:02.376 09:12:10 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:02.376 00:08:02.376 real 0m1.487s 00:08:02.376 user 0m0.012s 00:08:02.376 sys 0m0.000s 00:08:02.376 09:12:10 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.376 09:12:10 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:02.376 ************************************ 00:08:02.376 END TEST accel_xor 00:08:02.376 ************************************ 00:08:02.376 09:12:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:02.376 09:12:10 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:02.376 09:12:10 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:02.377 09:12:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.377 09:12:10 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.377 ************************************ 00:08:02.377 START TEST accel_dif_verify 00:08:02.377 ************************************ 00:08:02.377 09:12:10 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.377 09:12:10 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:02.377 09:12:11 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:08:02.377 [2024-07-15 09:12:11.027994] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
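accel_dif_verify switches the workload to dif_verify (-t 1 -w dif_verify in the logged command line); the configuration reads that follow list 4096-byte buffers together with what appear to be 512-byte blocks and 8 bytes of DIF metadata per block, again on the software module. Under the same assumptions as the sketches above:

    ./build/examples/accel_perf -t 1 -w dif_verify   # flags copied from the logged invocation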
00:08:02.377 [2024-07-15 09:12:11.028055] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid59296 ] 00:08:02.377 [2024-07-15 09:12:11.156737] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.377 [2024-07-15 09:12:11.256962] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:08:02.682 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:02.683 09:12:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:03.615 09:12:12 accel.accel_dif_verify -- 
accel/accel.sh@21 -- # case "$var" in 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:08:03.615 09:12:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:03.615 00:08:03.615 real 0m1.498s 00:08:03.615 user 0m0.011s 00:08:03.615 sys 0m0.002s 00:08:03.615 09:12:12 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:03.615 09:12:12 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:08:03.615 ************************************ 00:08:03.615 END TEST accel_dif_verify 00:08:03.615 ************************************ 00:08:03.615 09:12:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:03.615 09:12:12 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:03.615 09:12:12 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:03.615 09:12:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.615 09:12:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:03.873 ************************************ 00:08:03.873 START TEST accel_dif_generate 00:08:03.873 ************************************ 00:08:03.873 09:12:12 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 
-- # read -r var val 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:08:03.873 09:12:12 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:08:03.873 [2024-07-15 09:12:12.602139] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:03.873 [2024-07-15 09:12:12.602199] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid59505 ] 00:08:03.873 [2024-07-15 09:12:12.729152] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.131 [2024-07-15 09:12:12.828141] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@23 -- # 
accel_opc=dif_generate 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.131 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.132 09:12:12 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:04.132 09:12:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.503 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:08:05.504 09:12:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:05.504 00:08:05.504 real 0m1.489s 00:08:05.504 user 0m0.013s 00:08:05.504 sys 0m0.000s 00:08:05.504 
09:12:14 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:05.504 09:12:14 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:08:05.504 ************************************ 00:08:05.504 END TEST accel_dif_generate 00:08:05.504 ************************************ 00:08:05.504 09:12:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:05.504 09:12:14 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:08:05.504 09:12:14 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:05.504 09:12:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:05.504 09:12:14 accel -- common/autotest_common.sh@10 -- # set +x 00:08:05.504 ************************************ 00:08:05.504 START TEST accel_dif_generate_copy 00:08:05.504 ************************************ 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:08:05.504 [2024-07-15 09:12:14.152664] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
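accel_dif_generate_copy runs the combined dif_generate_copy workload in the same fashion (-t 1 -w dif_generate_copy in the logged command line). A final hand-run sketch under the assumptions above:

    ./build/examples/accel_perf -t 1 -w dif_generate_copy   # flags copied from the logged invocation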
00:08:05.504 [2024-07-15 09:12:14.152709] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid59700 ] 00:08:05.504 [2024-07-15 09:12:14.262106] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.504 [2024-07-15 09:12:14.362734] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- 
accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.504 09:12:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.878 09:12:15 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:06.878 00:08:06.878 real 0m1.469s 00:08:06.878 user 0m0.010s 00:08:06.878 sys 0m0.002s 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:06.878 09:12:15 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:08:06.878 ************************************ 00:08:06.878 END TEST accel_dif_generate_copy 00:08:06.878 ************************************ 00:08:06.878 09:12:15 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:06.878 09:12:15 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:08:06.878 09:12:15 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.878 09:12:15 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:06.878 09:12:15 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:06.878 09:12:15 accel -- common/autotest_common.sh@10 -- # set +x 00:08:06.878 ************************************ 00:08:06.878 START TEST accel_comp 00:08:06.878 ************************************ 00:08:06.878 09:12:15 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:06.878 
09:12:15 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:06.878 09:12:15 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:08:06.878 [2024-07-15 09:12:15.725134] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:06.878 [2024-07-15 09:12:15.725262] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid59905 ] 00:08:07.137 [2024-07-15 09:12:15.924030] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.137 [2024-07-15 09:12:16.030416] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:07.395 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 
09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:07.396 09:12:16 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 09:12:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:08.330 09:12:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:08.330 00:08:08.330 real 0m1.598s 00:08:08.330 user 0m0.011s 00:08:08.330 sys 0m0.002s 00:08:08.330 09:12:17 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:08.330 09:12:17 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:08:08.330 ************************************ 00:08:08.330 END TEST accel_comp 00:08:08.330 ************************************ 00:08:08.588 09:12:17 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:08.588 09:12:17 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:08.588 09:12:17 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:08.588 09:12:17 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:08.588 09:12:17 accel -- common/autotest_common.sh@10 -- # set +x 00:08:08.588 ************************************ 00:08:08.588 START TEST accel_decomp 
00:08:08.588 ************************************ 00:08:08.588 09:12:17 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:08.588 09:12:17 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:08.588 [2024-07-15 09:12:17.390370] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:08.588 [2024-07-15 09:12:17.390442] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid60245 ] 00:08:08.588 [2024-07-15 09:12:17.520359] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.846 [2024-07-15 09:12:17.628150] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 
09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.846 09:12:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:10.241 09:12:18 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:10.241 00:08:10.241 real 0m1.519s 00:08:10.241 user 0m0.012s 00:08:10.241 sys 0m0.002s 00:08:10.241 09:12:18 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.241 09:12:18 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:10.241 ************************************ 00:08:10.241 END TEST accel_decomp 00:08:10.241 ************************************ 
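Both the compress and decompress cases above feed the same input file, test/accel/bib, through the software module for one second; the decompress run adds -y, which presumably has accel_perf verify the decompressed output rather than just timing it. Stripped of the harness-specific -c /dev/fd/62 config channel, the two invocations recorded in the trace come down to roughly:

    ./build/examples/accel_perf -t 1 -w compress -l test/accel/bib
    ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y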
00:08:10.241 09:12:18 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:10.241 09:12:18 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:10.241 09:12:18 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:10.241 09:12:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.241 09:12:18 accel -- common/autotest_common.sh@10 -- # set +x 00:08:10.241 ************************************ 00:08:10.241 START TEST accel_decomp_full 00:08:10.241 ************************************ 00:08:10.241 09:12:18 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:10.241 09:12:18 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:10.241 [2024-07-15 09:12:18.984290] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
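Relative to the plain decompress case, the only new flag in this accel_decomp_full run is -o 0; judging by the '111250 bytes' transfer size programmed a few lines below (versus the 4096-byte default in the earlier runs), this appears to make accel_perf push the whole bib file through as a single buffer instead of 4 KiB chunks. The equivalent standalone invocation, under the same assumptions as before, would be:

    ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0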
00:08:10.241 [2024-07-15 09:12:18.984351] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid60454 ] 00:08:10.241 [2024-07-15 09:12:19.098204] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.499 [2024-07-15 09:12:19.204420] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 
accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.499 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.500 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:10.500 09:12:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:10.500 09:12:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:10.500 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:10.500 09:12:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.871 09:12:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:11.872 09:12:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:11.872 09:12:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:11.872 00:08:11.872 real 0m1.507s 00:08:11.872 user 0m0.012s 00:08:11.872 sys 0m0.002s 00:08:11.872 09:12:20 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.872 09:12:20 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:11.872 ************************************ 00:08:11.872 END TEST accel_decomp_full 00:08:11.872 ************************************ 00:08:11.872 09:12:20 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:11.872 09:12:20 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:11.872 09:12:20 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:11.872 09:12:20 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.872 09:12:20 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.872 ************************************ 00:08:11.872 START TEST accel_decomp_mcore 00:08:11.872 ************************************ 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 
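The accel_decomp_mcore case starting here differs from the single-core decompress run only by the added -m 0xf. Consistent with that, the EAL parameters below carry -c 0xf, four cores are reported available, and reactors come up on cores 0 through 3 instead of core 0 alone. Under the same assumptions as before, the standalone equivalent is roughly:

    ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -m 0xf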
00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:11.872 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:11.872 [2024-07-15 09:12:20.556799] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:11.872 [2024-07-15 09:12:20.556862] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid60648 ] 00:08:11.872 [2024-07-15 09:12:20.685458] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:11.872 [2024-07-15 09:12:20.789506] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:11.872 [2024-07-15 09:12:20.789590] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:11.872 [2024-07-15 09:12:20.789666] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:11.872 [2024-07-15 09:12:20.789670] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- 
accel/accel.sh@20 -- # val='1 seconds' 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:12.130 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.131 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.131 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.131 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:12.131 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.131 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.131 09:12:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 
00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.503 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:13.504 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:13.504 09:12:22 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:13.504 00:08:13.504 real 0m1.516s 00:08:13.504 user 0m4.760s 00:08:13.504 sys 0m0.197s 00:08:13.504 09:12:22 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:13.504 09:12:22 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:13.504 ************************************ 00:08:13.504 END TEST accel_decomp_mcore 00:08:13.504 ************************************ 00:08:13.504 09:12:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:13.504 09:12:22 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:13.504 09:12:22 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:13.504 09:12:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.504 09:12:22 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.504 ************************************ 00:08:13.504 START TEST accel_decomp_full_mcore 00:08:13.504 ************************************ 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.504 
09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:13.504 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:13.504 [2024-07-15 09:12:22.154108] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:13.504 [2024-07-15 09:12:22.154172] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid60850 ] 00:08:13.504 [2024-07-15 09:12:22.287319] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:13.504 [2024-07-15 09:12:22.396077] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:13.504 [2024-07-15 09:12:22.396161] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:13.504 [2024-07-15 09:12:22.396242] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:13.504 [2024-07-15 09:12:22.396245] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:13.763 09:12:22 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.763 09:12:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.698 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:14.957 00:08:14.957 real 0m1.537s 00:08:14.957 user 0m4.806s 00:08:14.957 sys 0m0.209s 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:14.957 09:12:23 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:14.957 ************************************ 00:08:14.957 END TEST accel_decomp_full_mcore 00:08:14.957 ************************************ 00:08:14.957 09:12:23 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:14.957 09:12:23 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:14.957 09:12:23 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:14.957 09:12:23 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.957 09:12:23 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.957 ************************************ 00:08:14.957 START TEST accel_decomp_mthread 00:08:14.957 ************************************ 00:08:14.957 09:12:23 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:14.957 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:14.957 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:14.957 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:14.957 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.957 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:14.958 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:14.958 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:14.958 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:14.958 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:14.958 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:14.958 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:14.958 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:14.958 09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:14.958 
09:12:23 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:14.958 [2024-07-15 09:12:23.774760] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:14.958 [2024-07-15 09:12:23.774822] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid61053 ] 00:08:14.958 [2024-07-15 09:12:23.903395] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.216 [2024-07-15 09:12:24.006571] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 
00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.216 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.217 09:12:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.217 09:12:24 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:16.589 00:08:16.589 real 0m1.514s 00:08:16.589 user 0m1.336s 00:08:16.589 sys 0m0.186s 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:16.589 09:12:25 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:16.589 ************************************ 00:08:16.589 END TEST accel_decomp_mthread 00:08:16.589 ************************************ 00:08:16.589 09:12:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:16.589 09:12:25 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:16.589 09:12:25 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 
00:08:16.589 09:12:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.589 09:12:25 accel -- common/autotest_common.sh@10 -- # set +x 00:08:16.589 ************************************ 00:08:16.589 START TEST accel_decomp_full_mthread 00:08:16.589 ************************************ 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:16.590 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:16.590 [2024-07-15 09:12:25.359084] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:08:16.590 [2024-07-15 09:12:25.359145] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid61331 ] 00:08:16.590 [2024-07-15 09:12:25.485428] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.849 [2024-07-15 09:12:25.583495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:16.849 09:12:25 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.849 09:12:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:18.225 00:08:18.225 real 0m1.514s 00:08:18.225 user 0m1.334s 00:08:18.225 sys 0m0.185s 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:18.225 09:12:26 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:18.225 ************************************ 00:08:18.225 END TEST accel_decomp_full_mthread 00:08:18.225 ************************************ 
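[Editor's aside — not part of the captured console output.] The decompress cases above all funnel into the same accel_perf binary with slightly different flags: the mcore runs pass -m 0xf (matching the EAL core mask 0xf and the four reactors started on cores 0-3), the mthread runs pass -T 2, and the "full" variants add -o 0 (the logged input size grows from '4096 bytes' to '111250 bytes'). Below is a minimal sketch of reproducing the software-path runs by hand, assuming a built SPDK tree at the workspace path shown in the log; the flag interpretations in the comments are inferred from the harness output above and should be treated as assumptions rather than accel_perf documentation.

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

  # accel_decomp_full_mcore: 1-second (-t 1) decompress (-w decompress) of test/accel/bib (-l),
  # with -y, -o 0 and -m 0xf exactly as the harness passes them (core mask 0xf == cores 0-3).
  $SPDK/build/examples/accel_perf -t 1 -w decompress -l $SPDK/test/accel/bib -y -o 0 -m 0xf

  # accel_decomp_mthread / accel_decomp_full_mthread: same workload, -T 2 in place of a core mask.
  $SPDK/build/examples/accel_perf -t 1 -w decompress -l $SPDK/test/accel/bib -y -T 2
  $SPDK/build/examples/accel_perf -t 1 -w decompress -l $SPDK/test/accel/bib -y -o 0 -T 2

In these runs the harness builds an empty accel JSON config (the [[ -n '' ]] branch above), so the work lands on the default software accel module — which is what the trailing [[ -n software ]] / accel_module=software checks assert before each test is declared passed.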
00:08:18.225 09:12:26 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:18.225 09:12:26 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:18.225 09:12:26 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:18.225 09:12:26 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:18.225 09:12:26 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:18.225 09:12:26 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=61605 00:08:18.225 09:12:26 accel -- accel/accel.sh@63 -- # waitforlisten 61605 00:08:18.225 09:12:26 accel -- common/autotest_common.sh@829 -- # '[' -z 61605 ']' 00:08:18.225 09:12:26 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:18.225 09:12:26 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:18.225 09:12:26 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:18.225 09:12:26 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:18.225 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:18.225 09:12:26 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:18.225 09:12:26 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:18.225 09:12:26 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:18.225 09:12:26 accel -- common/autotest_common.sh@10 -- # set +x 00:08:18.225 09:12:26 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:18.225 09:12:26 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:18.225 09:12:26 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:18.225 09:12:26 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:18.225 09:12:26 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:18.225 09:12:26 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:18.225 09:12:26 accel -- accel/accel.sh@41 -- # jq -r . 00:08:18.225 [2024-07-15 09:12:26.956184] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:08:18.226 [2024-07-15 09:12:26.956252] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid61605 ] 00:08:18.226 [2024-07-15 09:12:27.081677] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.486 [2024-07-15 09:12:27.178965] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.091 [2024-07-15 09:12:27.954546] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:19.350 09:12:28 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:19.350 09:12:28 accel -- common/autotest_common.sh@862 -- # return 0 00:08:19.350 09:12:28 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:19.350 09:12:28 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:19.350 09:12:28 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:19.350 09:12:28 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:19.350 09:12:28 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:19.350 09:12:28 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:19.350 09:12:28 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:19.350 09:12:28 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:19.350 09:12:28 accel -- common/autotest_common.sh@10 -- # set +x 00:08:19.350 09:12:28 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:19.608 09:12:28 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:19.608 "method": "compressdev_scan_accel_module", 00:08:19.608 09:12:28 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:19.608 09:12:28 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:19.608 09:12:28 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:19.608 09:12:28 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:19.608 09:12:28 accel -- common/autotest_common.sh@10 -- # set +x 00:08:19.608 09:12:28 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:19.608 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.608 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.608 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.608 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.608 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.608 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.608 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.608 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.608 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.608 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.608 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.608 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.609 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:19.609 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:19.609 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.609 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.609 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:08:19.609 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.609 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.609 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.609 09:12:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:19.609 09:12:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:19.609 09:12:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:19.609 09:12:28 accel -- accel/accel.sh@75 -- # killprocess 61605 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@948 -- # '[' -z 61605 ']' 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@952 -- # kill -0 61605 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@953 -- # uname 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 61605 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 61605' 00:08:19.609 killing process with pid 61605 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@967 -- # kill 61605 00:08:19.609 09:12:28 accel -- common/autotest_common.sh@972 -- # wait 61605 00:08:20.175 09:12:28 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:20.175 09:12:28 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:20.175 09:12:28 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:20.175 09:12:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.175 09:12:28 accel -- common/autotest_common.sh@10 -- # set +x 00:08:20.175 ************************************ 00:08:20.175 START TEST accel_cdev_comp 00:08:20.175 ************************************ 00:08:20.175 09:12:28 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:20.175 09:12:28 
accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:20.175 09:12:28 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:20.175 [2024-07-15 09:12:28.897221] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:20.175 [2024-07-15 09:12:28.897283] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid61809 ] 00:08:20.175 [2024-07-15 09:12:29.009638] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.175 [2024-07-15 09:12:29.110598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.109 [2024-07-15 09:12:29.878458] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:21.109 [2024-07-15 09:12:29.881063] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe20080 PMD being used: compress_qat 00:08:21.109 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:21.109 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.109 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.109 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:21.110 [2024-07-15 09:12:29.885169] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe24e60 PMD being used: compress_qat 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 
accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- 
accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.110 09:12:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:22.485 09:12:31 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:22.485 00:08:22.485 real 0m2.199s 00:08:22.485 user 0m1.630s 00:08:22.485 sys 0m0.572s 00:08:22.485 09:12:31 accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:22.485 09:12:31 
accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:22.485 ************************************ 00:08:22.485 END TEST accel_cdev_comp 00:08:22.485 ************************************ 00:08:22.485 09:12:31 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:22.486 09:12:31 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:22.486 09:12:31 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:22.486 09:12:31 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:22.486 09:12:31 accel -- common/autotest_common.sh@10 -- # set +x 00:08:22.486 ************************************ 00:08:22.486 START TEST accel_cdev_decomp 00:08:22.486 ************************************ 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:22.486 09:12:31 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:22.486 [2024-07-15 09:12:31.177859] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:08:22.486 [2024-07-15 09:12:31.177924] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid62174 ] 00:08:22.486 [2024-07-15 09:12:31.306730] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.486 [2024-07-15 09:12:31.407029] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.419 [2024-07-15 09:12:32.182524] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:23.419 [2024-07-15 09:12:32.185046] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1327080 PMD being used: compress_qat 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 [2024-07-15 09:12:32.189243] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x132be60 PMD being used: compress_qat 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.419 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
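The long runs of 'case "$var" in', 'IFS=:' and 'read -r var val' entries above come from a standard shell idiom: expected settings are emitted as colon-separated key/value pairs and consumed in a loop that splits each pair on ':' and dispatches on the key. A minimal, hypothetical bash sketch of that idiom (an illustration with made-up keys, not the actual accel.sh implementation):

    # Feed a few colon-separated key/value pairs into a loop that splits them
    # with IFS=: and handles each key in a case statement.
    printf '%s\n' 'opc:decompress' 'qd:32' 'time:1 seconds' |
    while IFS=: read -r var val; do
      case "$var" in
        opc)  echo "operation   = $val" ;;   # e.g. decompress
        qd)   echo "queue depth = $val" ;;   # e.g. 32
        time) echo "run time    = $val" ;;   # e.g. 1 seconds
        *)    : ;;                           # keys this sketch does not model
      esac
    done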
00:08:23.420 09:12:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:24.797 00:08:24.797 real 0m2.225s 00:08:24.797 user 0m1.636s 00:08:24.797 sys 0m0.589s 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:24.797 09:12:33 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:24.797 ************************************ 00:08:24.797 END TEST accel_cdev_decomp 00:08:24.797 ************************************ 00:08:24.797 09:12:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:24.797 09:12:33 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:24.797 09:12:33 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:24.797 09:12:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:24.797 09:12:33 accel -- common/autotest_common.sh@10 -- # set +x 00:08:24.797 ************************************ 00:08:24.797 START TEST accel_cdev_decomp_full 00:08:24.797 ************************************ 00:08:24.797 09:12:33 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:24.797 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:24.797 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:24.797 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:24.797 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:24.797 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:24.797 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:24.798 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:24.798 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:24.798 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:24.798 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:24.798 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:24.798 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:24.798 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:24.798 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:24.798 09:12:33 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:24.798 [2024-07-15 09:12:33.486089] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:08:24.798 [2024-07-15 09:12:33.486151] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid62531 ] 00:08:24.798 [2024-07-15 09:12:33.614900] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.798 [2024-07-15 09:12:33.714196] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.737 [2024-07-15 09:12:34.479394] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:25.737 [2024-07-15 09:12:34.482196] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b8b080 PMD being used: compress_qat 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 [2024-07-15 09:12:34.485547] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b8ace0 PMD being used: compress_qat 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- 
# val='111250 bytes' 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
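The accel_perf commands recorded above pass -c /dev/fd/62, so the JSON accel configuration (here the compressdev_scan_accel_module entry with "pmd": 0, taken verbatim from the trace) reaches the binary through a file descriptor rather than a config file on disk. A minimal bash sketch of that /dev/fd technique, with the descriptor number assumed and cat standing in for accel_perf:

    # Build the JSON fragment seen in the trace and expose it on descriptor 62.
    cfg='{"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}'
    exec 62< <(printf '%s\n' "$cfg")
    # A consumer can now read the config back through the /dev/fd path,
    # analogous to accel_perf being started with -c /dev/fd/62.
    cat /dev/fd/62
    exec 62<&-    # close the descriptor again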
00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.737 09:12:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:27.111 00:08:27.111 real 0m2.217s 00:08:27.111 user 0m1.649s 00:08:27.111 sys 0m0.574s 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:27.111 09:12:35 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:27.111 ************************************ 00:08:27.111 END TEST accel_cdev_decomp_full 00:08:27.111 ************************************ 00:08:27.111 09:12:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:27.111 09:12:35 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:27.111 09:12:35 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:27.111 09:12:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:27.111 09:12:35 accel -- common/autotest_common.sh@10 -- # set +x 00:08:27.111 ************************************ 00:08:27.111 START TEST accel_cdev_decomp_mcore 00:08:27.111 ************************************ 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:27.111 09:12:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:27.111 [2024-07-15 09:12:35.787073] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:08:27.111 [2024-07-15 09:12:35.787143] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid62738 ] 00:08:27.111 [2024-07-15 09:12:35.917877] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:27.111 [2024-07-15 09:12:36.021954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:27.111 [2024-07-15 09:12:36.021998] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:27.111 [2024-07-15 09:12:36.022075] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:27.111 [2024-07-15 09:12:36.022078] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.042 [2024-07-15 09:12:36.778046] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:28.042 [2024-07-15 09:12:36.780653] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c5e720 PMD being used: compress_qat 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.042 [2024-07-15 09:12:36.786287] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc12819b8b0 PMD being used: compress_qat 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.042 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 [2024-07-15 09:12:36.787033] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc12019b8b0 PMD being used: compress_qat 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 [2024-07-15 09:12:36.788163] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c639f0 PMD being used: compress_qat 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:28.043 [2024-07-15 09:12:36.788340] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc11819b8b0 PMD being used: compress_qat 00:08:28.043 09:12:36 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.043 09:12:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:29.416 00:08:29.416 real 0m2.234s 00:08:29.416 user 0m7.219s 00:08:29.416 sys 0m0.592s 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.416 09:12:37 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:29.416 ************************************ 00:08:29.416 END TEST accel_cdev_decomp_mcore 00:08:29.417 ************************************ 00:08:29.417 09:12:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:29.417 09:12:38 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:29.417 09:12:38 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:29.417 09:12:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:29.417 09:12:38 accel -- common/autotest_common.sh@10 -- # set +x 00:08:29.417 ************************************ 00:08:29.417 START TEST accel_cdev_decomp_full_mcore 00:08:29.417 ************************************ 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:29.417 
09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:29.417 09:12:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:29.417 [2024-07-15 09:12:38.106637] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:29.417 [2024-07-15 09:12:38.106701] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid63111 ] 00:08:29.417 [2024-07-15 09:12:38.236812] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:29.417 [2024-07-15 09:12:38.341466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:29.417 [2024-07-15 09:12:38.341552] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:29.417 [2024-07-15 09:12:38.341629] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:29.417 [2024-07-15 09:12:38.341632] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.354 [2024-07-15 09:12:39.108554] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:30.354 [2024-07-15 09:12:39.111169] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb13720 PMD being used: compress_qat 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:30.354 [2024-07-15 09:12:39.115980] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f16a019b8b0 PMD being used: compress_qat 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore 
-- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 [2024-07-15 09:12:39.116646] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f169819b8b0 PMD being used: compress_qat 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 [2024-07-15 09:12:39.117783] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb16a30 PMD being used: compress_qat 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.354 [2024-07-15 09:12:39.118026] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f169019b8b0 PMD being used: compress_qat 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@20 -- # val=32 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.354 09:12:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.728 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.729 09:12:40 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:31.729 00:08:31.729 real 0m2.232s 00:08:31.729 user 0m7.201s 00:08:31.729 sys 0m0.601s 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.729 09:12:40 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:31.729 ************************************ 00:08:31.729 END TEST accel_cdev_decomp_full_mcore 00:08:31.729 ************************************ 00:08:31.729 09:12:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:31.729 09:12:40 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:31.729 09:12:40 accel -- common/autotest_common.sh@1099 -- 
# '[' 11 -le 1 ']' 00:08:31.729 09:12:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.729 09:12:40 accel -- common/autotest_common.sh@10 -- # set +x 00:08:31.729 ************************************ 00:08:31.729 START TEST accel_cdev_decomp_mthread 00:08:31.729 ************************************ 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:31.729 09:12:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:31.729 [2024-07-15 09:12:40.421537] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:08:31.729 [2024-07-15 09:12:40.421601] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid63478 ] 00:08:31.729 [2024-07-15 09:12:40.551116] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.729 [2024-07-15 09:12:40.655429] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.681 [2024-07-15 09:12:41.427314] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:32.681 [2024-07-15 09:12:41.429963] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dce080 PMD being used: compress_qat 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.681 [2024-07-15 09:12:41.435032] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dd32a0 PMD being used: compress_qat 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.681 [2024-07-15 09:12:41.437587] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ef60f0 PMD being used: compress_qat 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:32.681 09:12:41 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 
09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.681 09:12:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 
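The trace above configures accel_perf for a two-thread decompress run backed by the DPDK compressdev (QAT) module, with the accel JSON config delivered over /dev/fd/62. A rough manual equivalent, assuming the workspace path from this log; the config envelope inside the process substitution is an assumption, since the harness assembles the real config from the compressdev_scan_accel_module fragment traced above:

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    # flags copied from the trace: 1-second decompress of test/accel/bib on 2 threads (-T 2)
    ./build/examples/accel_perf \
        -c <(echo '{"subsystems": [{"subsystem": "accel", "config": [{"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}]}]}') \
        -t 1 -w decompress -l test/accel/bib -y -T 2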
00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:34.057 00:08:34.057 real 0m2.237s 00:08:34.057 user 0m1.659s 00:08:34.057 sys 0m0.580s 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.057 09:12:42 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:34.057 ************************************ 00:08:34.057 END TEST accel_cdev_decomp_mthread 00:08:34.057 ************************************ 00:08:34.057 09:12:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:34.057 09:12:42 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:34.057 09:12:42 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:34.057 09:12:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.057 09:12:42 accel -- common/autotest_common.sh@10 -- # set +x 00:08:34.057 ************************************ 00:08:34.057 START TEST accel_cdev_decomp_full_mthread 00:08:34.057 ************************************ 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:34.057 09:12:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:34.057 [2024-07-15 09:12:42.737400] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:08:34.057 [2024-07-15 09:12:42.737462] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid63750 ] 00:08:34.057 [2024-07-15 09:12:42.865481] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.057 [2024-07-15 09:12:42.962645] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.991 [2024-07-15 09:12:43.731138] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:34.991 [2024-07-15 09:12:43.733709] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b5c080 PMD being used: compress_qat 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.991 [2024-07-15 09:12:43.737965] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b5f3b0 PMD being used: compress_qat 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.991 [2024-07-15 09:12:43.740881] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c83cc0 PMD being used: compress_qat 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
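The accel_cdev_decomp_full_mthread case traced here is the same decompress workload with one extra accel_perf flag, -o 0; correspondingly the value traced below is '111250 bytes' rather than the '4096 bytes' seen in the previous run. Under the same assumptions as the sketch above (with $accel_cfg standing in for the same hypothetical config string), the command line differs only in that flag:

    # same run as before, plus -o 0 as shown in the trace
    ./build/examples/accel_perf -c <(echo "$accel_cfg") -t 1 -w decompress -l test/accel/bib -y -o 0 -T 2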
00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:34.991 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.992 09:12:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.451 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.451 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.451 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.451 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.451 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.451 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:36.452 00:08:36.452 real 0m2.222s 00:08:36.452 user 0m1.631s 00:08:36.452 sys 0m0.594s 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:36.452 09:12:44 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:36.452 ************************************ 00:08:36.452 END TEST accel_cdev_decomp_full_mthread 00:08:36.452 ************************************ 00:08:36.452 09:12:44 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:36.452 09:12:44 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:36.452 09:12:44 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:36.452 09:12:44 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:36.452 09:12:44 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:36.452 09:12:44 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.452 09:12:44 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:36.452 09:12:44 accel -- common/autotest_common.sh@10 -- # set +x 00:08:36.452 09:12:44 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:36.452 09:12:44 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:36.452 09:12:44 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:36.452 09:12:44 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:36.452 09:12:44 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:36.452 09:12:44 accel -- accel/accel.sh@41 -- # jq -r . 00:08:36.452 ************************************ 00:08:36.452 START TEST accel_dif_functional_tests 00:08:36.452 ************************************ 00:08:36.452 09:12:45 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:36.452 [2024-07-15 09:12:45.072137] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
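The accel_dif_functional_tests case launched just above runs a standalone SPDK test app, test/accel/dif/dif, which also takes its accel JSON config on /dev/fd/62 (built here with no compressdev module, per the [[ -n '' ]] branch in the trace). A minimal sketch of invoking it directly, with an empty subsystems list as an assumed stand-in for the config the harness builds:

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    # run the DIF CUnit suite against a bare accel config (placeholder; the harness supplies the real one)
    ./test/accel/dif/dif -c <(echo '{"subsystems": []}')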
00:08:36.452 [2024-07-15 09:12:45.072199] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid64043 ] 00:08:36.452 [2024-07-15 09:12:45.200212] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:36.452 [2024-07-15 09:12:45.303350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:36.452 [2024-07-15 09:12:45.303436] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:36.452 [2024-07-15 09:12:45.303441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.710 00:08:36.710 00:08:36.710 CUnit - A unit testing framework for C - Version 2.1-3 00:08:36.710 http://cunit.sourceforge.net/ 00:08:36.710 00:08:36.710 00:08:36.710 Suite: accel_dif 00:08:36.710 Test: verify: DIF generated, GUARD check ...passed 00:08:36.710 Test: verify: DIF generated, APPTAG check ...passed 00:08:36.710 Test: verify: DIF generated, REFTAG check ...passed 00:08:36.710 Test: verify: DIF not generated, GUARD check ...[2024-07-15 09:12:45.402597] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:36.710 passed 00:08:36.711 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 09:12:45.402661] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:36.711 passed 00:08:36.711 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 09:12:45.402694] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:36.711 passed 00:08:36.711 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:36.711 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 09:12:45.402765] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:36.711 passed 00:08:36.711 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:36.711 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:36.711 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:36.711 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 09:12:45.402920] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:36.711 passed 00:08:36.711 Test: verify copy: DIF generated, GUARD check ...passed 00:08:36.711 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:36.711 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:36.711 Test: verify copy: DIF not generated, GUARD check ...[2024-07-15 09:12:45.403092] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:36.711 passed 00:08:36.711 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-15 09:12:45.403129] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:36.711 passed 00:08:36.711 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-15 09:12:45.403163] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:36.711 passed 00:08:36.711 Test: generate copy: DIF generated, GUARD check ...passed 00:08:36.711 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:36.711 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:36.711 Test: generate copy: DIF generated, no GUARD check flag set ...passed 
00:08:36.711 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:36.711 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:36.711 Test: generate copy: iovecs-len validate ...[2024-07-15 09:12:45.403406] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:08:36.711 passed 00:08:36.711 Test: generate copy: buffer alignment validate ...passed 00:08:36.711 00:08:36.711 Run Summary: Type Total Ran Passed Failed Inactive 00:08:36.711 suites 1 1 n/a 0 0 00:08:36.711 tests 26 26 26 0 0 00:08:36.711 asserts 115 115 115 0 n/a 00:08:36.711 00:08:36.711 Elapsed time = 0.003 seconds 00:08:36.711 00:08:36.711 real 0m0.594s 00:08:36.711 user 0m0.779s 00:08:36.711 sys 0m0.229s 00:08:36.711 09:12:45 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:36.711 09:12:45 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:36.711 ************************************ 00:08:36.711 END TEST accel_dif_functional_tests 00:08:36.711 ************************************ 00:08:36.711 09:12:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:36.711 00:08:36.711 real 0m53.385s 00:08:36.711 user 1m1.433s 00:08:36.711 sys 0m11.832s 00:08:36.711 09:12:45 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:36.711 09:12:45 accel -- common/autotest_common.sh@10 -- # set +x 00:08:36.711 ************************************ 00:08:36.711 END TEST accel 00:08:36.711 ************************************ 00:08:36.970 09:12:45 -- common/autotest_common.sh@1142 -- # return 0 00:08:36.970 09:12:45 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:36.970 09:12:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:36.970 09:12:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.970 09:12:45 -- common/autotest_common.sh@10 -- # set +x 00:08:36.970 ************************************ 00:08:36.970 START TEST accel_rpc 00:08:36.970 ************************************ 00:08:36.970 09:12:45 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:36.970 * Looking for test storage... 00:08:36.970 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:36.970 09:12:45 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:36.970 09:12:45 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=64279 00:08:36.970 09:12:45 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:36.970 09:12:45 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 64279 00:08:36.970 09:12:45 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 64279 ']' 00:08:36.970 09:12:45 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:36.970 09:12:45 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:36.970 09:12:45 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:36.970 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
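The accel_rpc suite being brought up here drives opcode assignment over JSON-RPC: spdk_tgt is started with --wait-for-rpc, the copy opcode is pinned to a module with accel_assign_opc, framework initialization is then completed, and the assignment is read back, as traced below. A rough manual equivalent using the paths from this log:

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./build/bin/spdk_tgt --wait-for-rpc &
    sleep 2                                                    # crude stand-in for the harness's waitforlisten
    ./scripts/rpc.py accel_assign_opc -o copy -m software      # assign the copy opcode to the software module
    ./scripts/rpc.py framework_start_init                      # finish startup now that opcodes are assigned
    ./scripts/rpc.py accel_get_opc_assignments | jq -r .copy   # prints: software
    kill %1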
00:08:36.970 09:12:45 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:36.970 09:12:45 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:36.970 [2024-07-15 09:12:45.909933] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:36.970 [2024-07-15 09:12:45.910007] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid64279 ] 00:08:37.228 [2024-07-15 09:12:46.038195] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.228 [2024-07-15 09:12:46.137061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.163 09:12:46 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:38.163 09:12:46 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:38.163 09:12:46 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:38.163 09:12:46 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:38.163 09:12:46 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:38.163 09:12:46 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:38.163 09:12:46 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:38.163 09:12:46 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:38.163 09:12:46 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:38.163 09:12:46 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:38.163 ************************************ 00:08:38.163 START TEST accel_assign_opcode 00:08:38.163 ************************************ 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:38.163 [2024-07-15 09:12:46.859378] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:38.163 [2024-07-15 09:12:46.871405] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 09:12:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:38.163 09:12:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.163 09:12:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd 
accel_get_opc_assignments 00:08:38.163 09:12:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:38.163 09:12:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 09:12:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:38.163 09:12:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:38.421 09:12:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.421 software 00:08:38.421 00:08:38.421 real 0m0.301s 00:08:38.421 user 0m0.050s 00:08:38.421 sys 0m0.014s 00:08:38.421 09:12:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:38.421 09:12:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:38.421 ************************************ 00:08:38.421 END TEST accel_assign_opcode 00:08:38.421 ************************************ 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:38.421 09:12:47 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 64279 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 64279 ']' 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 64279 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 64279 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 64279' 00:08:38.421 killing process with pid 64279 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@967 -- # kill 64279 00:08:38.421 09:12:47 accel_rpc -- common/autotest_common.sh@972 -- # wait 64279 00:08:38.989 00:08:38.989 real 0m1.898s 00:08:38.989 user 0m1.952s 00:08:38.989 sys 0m0.589s 00:08:38.989 09:12:47 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:38.989 09:12:47 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:38.989 ************************************ 00:08:38.989 END TEST accel_rpc 00:08:38.989 ************************************ 00:08:38.989 09:12:47 -- common/autotest_common.sh@1142 -- # return 0 00:08:38.989 09:12:47 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:38.989 09:12:47 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:38.989 09:12:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:38.989 09:12:47 -- common/autotest_common.sh@10 -- # set +x 00:08:38.989 ************************************ 00:08:38.989 START TEST app_cmdline 00:08:38.989 ************************************ 00:08:38.989 09:12:47 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:38.989 * Looking for test storage... 
00:08:38.989 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:38.989 09:12:47 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:38.989 09:12:47 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=64535 00:08:38.989 09:12:47 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 64535 00:08:38.989 09:12:47 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 64535 ']' 00:08:38.989 09:12:47 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:38.989 09:12:47 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:38.989 09:12:47 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:38.989 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:38.989 09:12:47 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:38.989 09:12:47 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:38.989 09:12:47 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:38.989 [2024-07-15 09:12:47.901836] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:38.989 [2024-07-15 09:12:47.901909] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid64535 ] 00:08:39.248 [2024-07-15 09:12:48.032773] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.248 [2024-07-15 09:12:48.135724] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.182 09:12:48 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:40.182 09:12:48 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:40.182 09:12:48 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:40.182 { 00:08:40.182 "version": "SPDK v24.09-pre git sha1 4835eb82b", 00:08:40.182 "fields": { 00:08:40.182 "major": 24, 00:08:40.182 "minor": 9, 00:08:40.182 "patch": 0, 00:08:40.182 "suffix": "-pre", 00:08:40.182 "commit": "4835eb82b" 00:08:40.182 } 00:08:40.182 } 00:08:40.182 09:12:48 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:40.182 09:12:48 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:40.182 09:12:48 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:40.182 09:12:48 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:40.182 09:12:48 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:40.182 09:12:48 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:40.182 09:12:48 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:40.182 09:12:48 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:40.182 09:12:48 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:40.182 09:12:48 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:40.182 09:12:49 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:40.182 09:12:49 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n 
]] 00:08:40.182 09:12:49 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:40.182 09:12:49 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:40.441 request: 00:08:40.441 { 00:08:40.441 "method": "env_dpdk_get_mem_stats", 00:08:40.441 "req_id": 1 00:08:40.441 } 00:08:40.441 Got JSON-RPC error response 00:08:40.441 response: 00:08:40.441 { 00:08:40.441 "code": -32601, 00:08:40.441 "message": "Method not found" 00:08:40.441 } 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:40.441 09:12:49 app_cmdline -- app/cmdline.sh@1 -- # killprocess 64535 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 64535 ']' 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 64535 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 64535 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 64535' 00:08:40.441 killing process with pid 64535 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@967 -- # kill 64535 00:08:40.441 09:12:49 app_cmdline -- common/autotest_common.sh@972 -- # wait 64535 00:08:41.007 00:08:41.007 real 0m1.937s 00:08:41.007 user 0m2.288s 00:08:41.007 sys 0m0.576s 00:08:41.007 09:12:49 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:41.007 09:12:49 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:41.007 
************************************ 00:08:41.007 END TEST app_cmdline 00:08:41.007 ************************************ 00:08:41.007 09:12:49 -- common/autotest_common.sh@1142 -- # return 0 00:08:41.007 09:12:49 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:41.007 09:12:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:41.007 09:12:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:41.007 09:12:49 -- common/autotest_common.sh@10 -- # set +x 00:08:41.007 ************************************ 00:08:41.007 START TEST version 00:08:41.007 ************************************ 00:08:41.007 09:12:49 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:41.007 * Looking for test storage... 00:08:41.007 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:41.007 09:12:49 version -- app/version.sh@17 -- # get_header_version major 00:08:41.007 09:12:49 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:41.007 09:12:49 version -- app/version.sh@14 -- # cut -f2 00:08:41.007 09:12:49 version -- app/version.sh@14 -- # tr -d '"' 00:08:41.007 09:12:49 version -- app/version.sh@17 -- # major=24 00:08:41.007 09:12:49 version -- app/version.sh@18 -- # get_header_version minor 00:08:41.007 09:12:49 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:41.007 09:12:49 version -- app/version.sh@14 -- # cut -f2 00:08:41.007 09:12:49 version -- app/version.sh@14 -- # tr -d '"' 00:08:41.007 09:12:49 version -- app/version.sh@18 -- # minor=9 00:08:41.007 09:12:49 version -- app/version.sh@19 -- # get_header_version patch 00:08:41.007 09:12:49 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:41.007 09:12:49 version -- app/version.sh@14 -- # cut -f2 00:08:41.007 09:12:49 version -- app/version.sh@14 -- # tr -d '"' 00:08:41.007 09:12:49 version -- app/version.sh@19 -- # patch=0 00:08:41.007 09:12:49 version -- app/version.sh@20 -- # get_header_version suffix 00:08:41.007 09:12:49 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:41.007 09:12:49 version -- app/version.sh@14 -- # cut -f2 00:08:41.007 09:12:49 version -- app/version.sh@14 -- # tr -d '"' 00:08:41.007 09:12:49 version -- app/version.sh@20 -- # suffix=-pre 00:08:41.007 09:12:49 version -- app/version.sh@22 -- # version=24.9 00:08:41.007 09:12:49 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:41.007 09:12:49 version -- app/version.sh@28 -- # version=24.9rc0 00:08:41.007 09:12:49 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:41.007 09:12:49 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:41.007 09:12:49 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:41.007 09:12:49 version 
-- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:41.007 00:08:41.007 real 0m0.187s 00:08:41.007 user 0m0.088s 00:08:41.007 sys 0m0.144s 00:08:41.007 09:12:49 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:41.007 09:12:49 version -- common/autotest_common.sh@10 -- # set +x 00:08:41.007 ************************************ 00:08:41.007 END TEST version 00:08:41.007 ************************************ 00:08:41.266 09:12:49 -- common/autotest_common.sh@1142 -- # return 0 00:08:41.266 09:12:49 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:41.266 09:12:49 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:41.266 09:12:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:41.266 09:12:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:41.266 09:12:49 -- common/autotest_common.sh@10 -- # set +x 00:08:41.266 ************************************ 00:08:41.266 START TEST blockdev_general 00:08:41.266 ************************************ 00:08:41.266 09:12:50 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:41.266 * Looking for test storage... 00:08:41.266 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:41.266 09:12:50 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@691 -- # 
wait_for_rpc=--wait-for-rpc 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=65003 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 65003 00:08:41.266 09:12:50 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:41.266 09:12:50 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 65003 ']' 00:08:41.266 09:12:50 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:41.266 09:12:50 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:41.266 09:12:50 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:41.266 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:41.266 09:12:50 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:41.266 09:12:50 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:41.266 [2024-07-15 09:12:50.200183] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:08:41.266 [2024-07-15 09:12:50.200257] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid65003 ] 00:08:41.524 [2024-07-15 09:12:50.327275] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.524 [2024-07-15 09:12:50.424057] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.459 09:12:51 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:42.459 09:12:51 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:08:42.459 09:12:51 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:42.459 09:12:51 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:42.459 09:12:51 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:42.459 09:12:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:42.459 09:12:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:42.459 [2024-07-15 09:12:51.369750] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:42.459 [2024-07-15 09:12:51.369810] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:42.459 00:08:42.459 [2024-07-15 09:12:51.377733] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:42.459 [2024-07-15 09:12:51.377762] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:42.459 00:08:42.459 Malloc0 00:08:42.459 Malloc1 00:08:42.718 Malloc2 00:08:42.718 Malloc3 00:08:42.718 Malloc4 00:08:42.718 Malloc5 00:08:42.718 Malloc6 00:08:42.718 Malloc7 00:08:42.718 Malloc8 00:08:42.718 Malloc9 00:08:42.718 [2024-07-15 09:12:51.526501] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:42.718 [2024-07-15 09:12:51.526552] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:42.718 [2024-07-15 
09:12:51.526573] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e8350 00:08:42.718 [2024-07-15 09:12:51.526586] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:42.718 [2024-07-15 09:12:51.527988] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:42.718 [2024-07-15 09:12:51.528019] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:42.718 TestPT 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:42.718 09:12:51 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:42.718 5000+0 records in 00:08:42.718 5000+0 records out 00:08:42.718 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0283651 s, 361 MB/s 00:08:42.718 09:12:51 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:42.718 AIO0 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:42.718 09:12:51 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:42.718 09:12:51 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:42.718 09:12:51 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:42.718 09:12:51 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:42.718 09:12:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:42.976 09:12:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:42.976 09:12:51 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:42.976 09:12:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:42.976 09:12:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:42.976 09:12:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:42.976 09:12:51 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:42.976 09:12:51 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:42.976 09:12:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:42.976 09:12:51 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:42.976 09:12:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:42.976 09:12:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.236 09:12:51 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:43.236 09:12:51 
blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:43.237 09:12:51 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "9d434293-4838-4844-8a94-ff05f239b5cf"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9d434293-4838-4844-8a94-ff05f239b5cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "7f9948b0-3712-54e2-8746-4caed203ad9a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7f9948b0-3712-54e2-8746-4caed203ad9a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "e4a2d656-8e4d-59e6-a010-87f03c3fe05d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e4a2d656-8e4d-59e6-a010-87f03c3fe05d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "01f8811e-da90-514d-a368-ca6c62342984"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "01f8811e-da90-514d-a368-ca6c62342984",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "52b553cf-6865-5fec-a1bb-8577821c1d71"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "52b553cf-6865-5fec-a1bb-8577821c1d71",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "4dbadad4-929c-5842-8875-98548860095d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4dbadad4-929c-5842-8875-98548860095d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d230eb87-cbba-5020-8d68-b2df9505c49c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d230eb87-cbba-5020-8d68-b2df9505c49c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' 
"3185c29a-e08b-5ef3-8361-c01c142053da"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3185c29a-e08b-5ef3-8361-c01c142053da",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "09d616e4-d884-5312-8f07-75409a4fa3cf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "09d616e4-d884-5312-8f07-75409a4fa3cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "38c52917-3a44-5583-8ea8-00269e3e9289"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "38c52917-3a44-5583-8ea8-00269e3e9289",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "67e2ba06-5292-590a-bfb2-77e5e39162ee"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "67e2ba06-5292-590a-bfb2-77e5e39162ee",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "e4f0ff21-2901-5825-8f0e-8204bbde143e"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e4f0ff21-2901-5825-8f0e-8204bbde143e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "33b9b487-3efc-4acc-8fdc-a2a9d02d50c6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "33b9b487-3efc-4acc-8fdc-a2a9d02d50c6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "33b9b487-3efc-4acc-8fdc-a2a9d02d50c6",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "428cfc74-3bf1-4a4b-a3bc-2a2fed04e202",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "f8f8c968-bfe5-4a1e-a0c0-98869704e9b7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "f0efb1ce-4bd3-47a3-8863-1d70b87e5763"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f0efb1ce-4bd3-47a3-8863-1d70b87e5763",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f0efb1ce-4bd3-47a3-8863-1d70b87e5763",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "3fc94c3f-7d00-4841-845d-13d9683244d3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a0403874-7e7e-45ee-b974-b43b5f23ee26",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "879c81c3-edbc-4722-ad4c-6e584e2a6572"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "879c81c3-edbc-4722-ad4c-6e584e2a6572",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "879c81c3-edbc-4722-ad4c-6e584e2a6572",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "d381df3f-20bc-4158-b14b-5813c0a8c4c3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "2c15ca4b-eadd-4af1-9c5c-c55127866c92",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "ad2cf2c5-5ee8-4b9b-8db4-4d51fd0f6953"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"ad2cf2c5-5ee8-4b9b-8db4-4d51fd0f6953",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:43.237 09:12:51 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:43.237 09:12:51 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:43.237 09:12:51 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:43.237 09:12:51 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 65003 00:08:43.237 09:12:51 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 65003 ']' 00:08:43.237 09:12:51 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 65003 00:08:43.237 09:12:51 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:08:43.237 09:12:51 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:43.237 09:12:52 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 65003 00:08:43.237 09:12:52 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:43.237 09:12:52 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:43.237 09:12:52 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 65003' 00:08:43.237 killing process with pid 65003 00:08:43.237 09:12:52 blockdev_general -- common/autotest_common.sh@967 -- # kill 65003 00:08:43.237 09:12:52 blockdev_general -- common/autotest_common.sh@972 -- # wait 65003 00:08:43.803 09:12:52 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:43.803 09:12:52 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:43.803 09:12:52 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:43.803 09:12:52 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:43.803 09:12:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:43.803 ************************************ 00:08:43.803 START TEST bdev_hello_world 00:08:43.803 ************************************ 00:08:43.803 09:12:52 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:43.803 [2024-07-15 09:12:52.609678] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:08:43.803 [2024-07-15 09:12:52.609741] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid65378 ] 00:08:43.803 [2024-07-15 09:12:52.737527] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.061 [2024-07-15 09:12:52.839521] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.061 [2024-07-15 09:12:53.000593] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:44.061 [2024-07-15 09:12:53.000651] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:44.061 [2024-07-15 09:12:53.000665] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:44.061 [2024-07-15 09:12:53.008595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:44.061 [2024-07-15 09:12:53.008624] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:44.319 [2024-07-15 09:12:53.016606] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:44.319 [2024-07-15 09:12:53.016631] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:44.319 [2024-07-15 09:12:53.093692] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:44.319 [2024-07-15 09:12:53.093744] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:44.319 [2024-07-15 09:12:53.093763] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15853c0 00:08:44.319 [2024-07-15 09:12:53.093776] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:44.319 [2024-07-15 09:12:53.095223] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:44.319 [2024-07-15 09:12:53.095253] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:44.320 [2024-07-15 09:12:53.235384] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:44.320 [2024-07-15 09:12:53.235454] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:44.320 [2024-07-15 09:12:53.235509] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:44.320 [2024-07-15 09:12:53.235586] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:44.320 [2024-07-15 09:12:53.235662] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:44.320 [2024-07-15 09:12:53.235695] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:44.320 [2024-07-15 09:12:53.235758] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
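The hello_bdev run above opens Malloc0 from the generated bdev.json, writes a buffer through an I/O channel, reads it back and logs the string before stopping the app. A minimal sketch of launching the same example by hand, assuming the workspace layout shown in the log and a JSON config that defines Malloc0:

    # Run the hello world bdev example against an explicit config and bdev name
    ./build/examples/hello_bdev --json ./test/bdev/bdev.json -b Malloc0
    # On success it logs 'Read string from bdev : Hello World!' and exits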
00:08:44.320 00:08:44.320 [2024-07-15 09:12:53.235799] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:44.885 00:08:44.885 real 0m1.030s 00:08:44.885 user 0m0.682s 00:08:44.885 sys 0m0.316s 00:08:44.885 09:12:53 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:44.885 09:12:53 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:44.885 ************************************ 00:08:44.885 END TEST bdev_hello_world 00:08:44.885 ************************************ 00:08:44.885 09:12:53 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:44.885 09:12:53 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:44.885 09:12:53 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:44.885 09:12:53 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.885 09:12:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:44.885 ************************************ 00:08:44.885 START TEST bdev_bounds 00:08:44.885 ************************************ 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=65570 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 65570' 00:08:44.885 Process bdevio pid: 65570 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 65570 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 65570 ']' 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:44.885 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:44.885 09:12:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:44.885 [2024-07-15 09:12:53.710941] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
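The bdev_bounds test starting here drives the bdevio CUnit harness over every bdev in bdev.json and then triggers the suites over RPC, which produces the per-bdev "Suite: bdevio tests on: ..." output that follows. A minimal sketch of the two-step invocation, simplified from the harness (the real script backgrounds bdevio, waits for its RPC socket with waitforlisten, and traps cleanup):

    # Start bdevio with the bdev config; -w tells it to wait for tests started over RPC
    ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &
    # Once the socket is up, kick off all registered CUnit suites
    ./test/bdev/bdevio/tests.py perform_tests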
00:08:44.885 [2024-07-15 09:12:53.711010] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid65570 ] 00:08:44.885 [2024-07-15 09:12:53.830268] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:45.142 [2024-07-15 09:12:53.939440] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:45.142 [2024-07-15 09:12:53.939525] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:45.142 [2024-07-15 09:12:53.939530] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.400 [2024-07-15 09:12:54.096662] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:45.400 [2024-07-15 09:12:54.096726] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:45.400 [2024-07-15 09:12:54.096742] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:45.400 [2024-07-15 09:12:54.104674] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:45.400 [2024-07-15 09:12:54.104706] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:45.400 [2024-07-15 09:12:54.112689] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:45.400 [2024-07-15 09:12:54.112714] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:45.400 [2024-07-15 09:12:54.189953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:45.400 [2024-07-15 09:12:54.190004] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:45.400 [2024-07-15 09:12:54.190023] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26cd0c0 00:08:45.400 [2024-07-15 09:12:54.190035] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:45.400 [2024-07-15 09:12:54.191492] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:45.400 [2024-07-15 09:12:54.191520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:45.966 09:12:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:45.966 09:12:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:08:45.966 09:12:54 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:45.966 I/O targets: 00:08:45.966 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:45.966 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:45.966 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:45.966 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:45.966 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:45.966 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:45.966 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:45.966 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:45.966 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:45.966 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:45.966 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:45.966 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:45.966 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:45.966 concat0: 131072 blocks of 512 bytes (64 MiB) 
00:08:45.966 raid1: 65536 blocks of 512 bytes (32 MiB) 00:08:45.966 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:45.966 00:08:45.966 00:08:45.966 CUnit - A unit testing framework for C - Version 2.1-3 00:08:45.966 http://cunit.sourceforge.net/ 00:08:45.966 00:08:45.966 00:08:45.966 Suite: bdevio tests on: AIO0 00:08:45.966 Test: blockdev write read block ...passed 00:08:45.966 Test: blockdev write zeroes read block ...passed 00:08:45.966 Test: blockdev write zeroes read no split ...passed 00:08:45.966 Test: blockdev write zeroes read split ...passed 00:08:45.966 Test: blockdev write zeroes read split partial ...passed 00:08:45.966 Test: blockdev reset ...passed 00:08:45.966 Test: blockdev write read 8 blocks ...passed 00:08:45.966 Test: blockdev write read size > 128k ...passed 00:08:45.966 Test: blockdev write read invalid size ...passed 00:08:45.966 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:45.966 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:45.966 Test: blockdev write read max offset ...passed 00:08:45.966 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:45.966 Test: blockdev writev readv 8 blocks ...passed 00:08:45.966 Test: blockdev writev readv 30 x 1block ...passed 00:08:45.966 Test: blockdev writev readv block ...passed 00:08:45.966 Test: blockdev writev readv size > 128k ...passed 00:08:45.966 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:45.966 Test: blockdev comparev and writev ...passed 00:08:45.966 Test: blockdev nvme passthru rw ...passed 00:08:45.966 Test: blockdev nvme passthru vendor specific ...passed 00:08:45.966 Test: blockdev nvme admin passthru ...passed 00:08:45.966 Test: blockdev copy ...passed 00:08:45.966 Suite: bdevio tests on: raid1 00:08:45.966 Test: blockdev write read block ...passed 00:08:45.966 Test: blockdev write zeroes read block ...passed 00:08:45.966 Test: blockdev write zeroes read no split ...passed 00:08:45.966 Test: blockdev write zeroes read split ...passed 00:08:45.966 Test: blockdev write zeroes read split partial ...passed 00:08:45.966 Test: blockdev reset ...passed 00:08:45.966 Test: blockdev write read 8 blocks ...passed 00:08:45.966 Test: blockdev write read size > 128k ...passed 00:08:45.966 Test: blockdev write read invalid size ...passed 00:08:45.966 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:45.966 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:45.966 Test: blockdev write read max offset ...passed 00:08:45.966 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:45.966 Test: blockdev writev readv 8 blocks ...passed 00:08:45.966 Test: blockdev writev readv 30 x 1block ...passed 00:08:45.966 Test: blockdev writev readv block ...passed 00:08:45.966 Test: blockdev writev readv size > 128k ...passed 00:08:45.966 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:45.966 Test: blockdev comparev and writev ...passed 00:08:45.966 Test: blockdev nvme passthru rw ...passed 00:08:45.966 Test: blockdev nvme passthru vendor specific ...passed 00:08:45.966 Test: blockdev nvme admin passthru ...passed 00:08:45.966 Test: blockdev copy ...passed 00:08:45.966 Suite: bdevio tests on: concat0 00:08:45.966 Test: blockdev write read block ...passed 00:08:45.966 Test: blockdev write zeroes read block ...passed 00:08:45.966 Test: blockdev write zeroes read no split ...passed 00:08:45.966 Test: blockdev write zeroes read split 
...passed 00:08:45.966 Test: blockdev write zeroes read split partial ...passed 00:08:45.966 Test: blockdev reset ...passed 00:08:45.966 Test: blockdev write read 8 blocks ...passed 00:08:45.966 Test: blockdev write read size > 128k ...passed 00:08:45.966 Test: blockdev write read invalid size ...passed 00:08:45.966 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:45.966 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:45.966 Test: blockdev write read max offset ...passed 00:08:45.966 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:45.966 Test: blockdev writev readv 8 blocks ...passed 00:08:45.966 Test: blockdev writev readv 30 x 1block ...passed 00:08:45.966 Test: blockdev writev readv block ...passed 00:08:45.966 Test: blockdev writev readv size > 128k ...passed 00:08:45.966 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:45.966 Test: blockdev comparev and writev ...passed 00:08:45.966 Test: blockdev nvme passthru rw ...passed 00:08:45.966 Test: blockdev nvme passthru vendor specific ...passed 00:08:45.966 Test: blockdev nvme admin passthru ...passed 00:08:45.966 Test: blockdev copy ...passed 00:08:45.966 Suite: bdevio tests on: raid0 00:08:45.966 Test: blockdev write read block ...passed 00:08:45.966 Test: blockdev write zeroes read block ...passed 00:08:45.966 Test: blockdev write zeroes read no split ...passed 00:08:45.966 Test: blockdev write zeroes read split ...passed 00:08:45.966 Test: blockdev write zeroes read split partial ...passed 00:08:45.966 Test: blockdev reset ...passed 00:08:45.966 Test: blockdev write read 8 blocks ...passed 00:08:45.966 Test: blockdev write read size > 128k ...passed 00:08:45.966 Test: blockdev write read invalid size ...passed 00:08:45.966 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:45.966 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:45.966 Test: blockdev write read max offset ...passed 00:08:45.966 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:45.966 Test: blockdev writev readv 8 blocks ...passed 00:08:45.966 Test: blockdev writev readv 30 x 1block ...passed 00:08:45.966 Test: blockdev writev readv block ...passed 00:08:45.966 Test: blockdev writev readv size > 128k ...passed 00:08:45.966 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:45.966 Test: blockdev comparev and writev ...passed 00:08:45.966 Test: blockdev nvme passthru rw ...passed 00:08:45.967 Test: blockdev nvme passthru vendor specific ...passed 00:08:45.967 Test: blockdev nvme admin passthru ...passed 00:08:45.967 Test: blockdev copy ...passed 00:08:45.967 Suite: bdevio tests on: TestPT 00:08:45.967 Test: blockdev write read block ...passed 00:08:45.967 Test: blockdev write zeroes read block ...passed 00:08:45.967 Test: blockdev write zeroes read no split ...passed 00:08:45.967 Test: blockdev write zeroes read split ...passed 00:08:45.967 Test: blockdev write zeroes read split partial ...passed 00:08:45.967 Test: blockdev reset ...passed 00:08:45.967 Test: blockdev write read 8 blocks ...passed 00:08:45.967 Test: blockdev write read size > 128k ...passed 00:08:45.967 Test: blockdev write read invalid size ...passed 00:08:45.967 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:45.967 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:45.967 Test: blockdev write read max offset ...passed 00:08:45.967 Test: 
blockdev write read 2 blocks on overlapped address offset ...passed 00:08:45.967 Test: blockdev writev readv 8 blocks ...passed 00:08:45.967 Test: blockdev writev readv 30 x 1block ...passed 00:08:45.967 Test: blockdev writev readv block ...passed 00:08:45.967 Test: blockdev writev readv size > 128k ...passed 00:08:45.967 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:45.967 Test: blockdev comparev and writev ...passed 00:08:45.967 Test: blockdev nvme passthru rw ...passed 00:08:45.967 Test: blockdev nvme passthru vendor specific ...passed 00:08:45.967 Test: blockdev nvme admin passthru ...passed 00:08:45.967 Test: blockdev copy ...passed 00:08:45.967 Suite: bdevio tests on: Malloc2p7 00:08:45.967 Test: blockdev write read block ...passed 00:08:45.967 Test: blockdev write zeroes read block ...passed 00:08:45.967 Test: blockdev write zeroes read no split ...passed 00:08:45.967 Test: blockdev write zeroes read split ...passed 00:08:45.967 Test: blockdev write zeroes read split partial ...passed 00:08:45.967 Test: blockdev reset ...passed 00:08:45.967 Test: blockdev write read 8 blocks ...passed 00:08:45.967 Test: blockdev write read size > 128k ...passed 00:08:45.967 Test: blockdev write read invalid size ...passed 00:08:45.967 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:45.967 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:45.967 Test: blockdev write read max offset ...passed 00:08:45.967 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:45.967 Test: blockdev writev readv 8 blocks ...passed 00:08:45.967 Test: blockdev writev readv 30 x 1block ...passed 00:08:45.967 Test: blockdev writev readv block ...passed 00:08:45.967 Test: blockdev writev readv size > 128k ...passed 00:08:45.967 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:45.967 Test: blockdev comparev and writev ...passed 00:08:45.967 Test: blockdev nvme passthru rw ...passed 00:08:45.967 Test: blockdev nvme passthru vendor specific ...passed 00:08:45.967 Test: blockdev nvme admin passthru ...passed 00:08:45.967 Test: blockdev copy ...passed 00:08:45.967 Suite: bdevio tests on: Malloc2p6 00:08:45.967 Test: blockdev write read block ...passed 00:08:45.967 Test: blockdev write zeroes read block ...passed 00:08:45.967 Test: blockdev write zeroes read no split ...passed 00:08:45.967 Test: blockdev write zeroes read split ...passed 00:08:45.967 Test: blockdev write zeroes read split partial ...passed 00:08:45.967 Test: blockdev reset ...passed 00:08:45.967 Test: blockdev write read 8 blocks ...passed 00:08:45.967 Test: blockdev write read size > 128k ...passed 00:08:45.967 Test: blockdev write read invalid size ...passed 00:08:45.967 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:45.967 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:45.967 Test: blockdev write read max offset ...passed 00:08:45.967 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:45.967 Test: blockdev writev readv 8 blocks ...passed 00:08:45.967 Test: blockdev writev readv 30 x 1block ...passed 00:08:45.967 Test: blockdev writev readv block ...passed 00:08:45.967 Test: blockdev writev readv size > 128k ...passed 00:08:45.967 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:45.967 Test: blockdev comparev and writev ...passed 00:08:45.967 Test: blockdev nvme passthru rw ...passed 00:08:45.967 Test: blockdev nvme passthru vendor 
specific ...passed 00:08:45.967 Test: blockdev nvme admin passthru ...passed 00:08:45.967 Test: blockdev copy ...passed 00:08:45.967 Suite: bdevio tests on: Malloc2p5 00:08:45.967 Test: blockdev write read block ...passed 00:08:45.967 Test: blockdev write zeroes read block ...passed 00:08:45.967 Test: blockdev write zeroes read no split ...passed 00:08:45.967 Test: blockdev write zeroes read split ...passed 00:08:45.967 Test: blockdev write zeroes read split partial ...passed 00:08:45.967 Test: blockdev reset ...passed 00:08:45.967 Test: blockdev write read 8 blocks ...passed 00:08:45.967 Test: blockdev write read size > 128k ...passed 00:08:45.967 Test: blockdev write read invalid size ...passed 00:08:45.967 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:45.967 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:45.967 Test: blockdev write read max offset ...passed 00:08:45.967 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:45.967 Test: blockdev writev readv 8 blocks ...passed 00:08:45.967 Test: blockdev writev readv 30 x 1block ...passed 00:08:45.967 Test: blockdev writev readv block ...passed 00:08:45.967 Test: blockdev writev readv size > 128k ...passed 00:08:45.967 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:45.967 Test: blockdev comparev and writev ...passed 00:08:45.967 Test: blockdev nvme passthru rw ...passed 00:08:45.967 Test: blockdev nvme passthru vendor specific ...passed 00:08:45.967 Test: blockdev nvme admin passthru ...passed 00:08:45.967 Test: blockdev copy ...passed 00:08:45.967 Suite: bdevio tests on: Malloc2p4 00:08:45.967 Test: blockdev write read block ...passed 00:08:45.967 Test: blockdev write zeroes read block ...passed 00:08:45.967 Test: blockdev write zeroes read no split ...passed 00:08:45.967 Test: blockdev write zeroes read split ...passed 00:08:45.967 Test: blockdev write zeroes read split partial ...passed 00:08:45.967 Test: blockdev reset ...passed 00:08:45.967 Test: blockdev write read 8 blocks ...passed 00:08:45.967 Test: blockdev write read size > 128k ...passed 00:08:45.967 Test: blockdev write read invalid size ...passed 00:08:45.967 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:45.967 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:45.967 Test: blockdev write read max offset ...passed 00:08:45.967 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:45.967 Test: blockdev writev readv 8 blocks ...passed 00:08:45.967 Test: blockdev writev readv 30 x 1block ...passed 00:08:45.967 Test: blockdev writev readv block ...passed 00:08:45.967 Test: blockdev writev readv size > 128k ...passed 00:08:45.967 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:45.967 Test: blockdev comparev and writev ...passed 00:08:45.967 Test: blockdev nvme passthru rw ...passed 00:08:45.967 Test: blockdev nvme passthru vendor specific ...passed 00:08:45.967 Test: blockdev nvme admin passthru ...passed 00:08:45.967 Test: blockdev copy ...passed 00:08:45.967 Suite: bdevio tests on: Malloc2p3 00:08:45.967 Test: blockdev write read block ...passed 00:08:45.967 Test: blockdev write zeroes read block ...passed 00:08:45.967 Test: blockdev write zeroes read no split ...passed 00:08:46.226 Test: blockdev write zeroes read split ...passed 00:08:46.226 Test: blockdev write zeroes read split partial ...passed 00:08:46.226 Test: blockdev reset ...passed 00:08:46.226 Test: 
blockdev write read 8 blocks ...passed 00:08:46.226 Test: blockdev write read size > 128k ...passed 00:08:46.226 Test: blockdev write read invalid size ...passed 00:08:46.226 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:46.226 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:46.226 Test: blockdev write read max offset ...passed 00:08:46.226 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:46.226 Test: blockdev writev readv 8 blocks ...passed 00:08:46.226 Test: blockdev writev readv 30 x 1block ...passed 00:08:46.226 Test: blockdev writev readv block ...passed 00:08:46.226 Test: blockdev writev readv size > 128k ...passed 00:08:46.226 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:46.226 Test: blockdev comparev and writev ...passed 00:08:46.226 Test: blockdev nvme passthru rw ...passed 00:08:46.226 Test: blockdev nvme passthru vendor specific ...passed 00:08:46.226 Test: blockdev nvme admin passthru ...passed 00:08:46.226 Test: blockdev copy ...passed 00:08:46.226 Suite: bdevio tests on: Malloc2p2 00:08:46.226 Test: blockdev write read block ...passed 00:08:46.226 Test: blockdev write zeroes read block ...passed 00:08:46.226 Test: blockdev write zeroes read no split ...passed 00:08:46.226 Test: blockdev write zeroes read split ...passed 00:08:46.226 Test: blockdev write zeroes read split partial ...passed 00:08:46.226 Test: blockdev reset ...passed 00:08:46.226 Test: blockdev write read 8 blocks ...passed 00:08:46.226 Test: blockdev write read size > 128k ...passed 00:08:46.226 Test: blockdev write read invalid size ...passed 00:08:46.226 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:46.226 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:46.226 Test: blockdev write read max offset ...passed 00:08:46.226 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:46.226 Test: blockdev writev readv 8 blocks ...passed 00:08:46.226 Test: blockdev writev readv 30 x 1block ...passed 00:08:46.226 Test: blockdev writev readv block ...passed 00:08:46.226 Test: blockdev writev readv size > 128k ...passed 00:08:46.226 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:46.226 Test: blockdev comparev and writev ...passed 00:08:46.226 Test: blockdev nvme passthru rw ...passed 00:08:46.226 Test: blockdev nvme passthru vendor specific ...passed 00:08:46.226 Test: blockdev nvme admin passthru ...passed 00:08:46.226 Test: blockdev copy ...passed 00:08:46.226 Suite: bdevio tests on: Malloc2p1 00:08:46.226 Test: blockdev write read block ...passed 00:08:46.226 Test: blockdev write zeroes read block ...passed 00:08:46.226 Test: blockdev write zeroes read no split ...passed 00:08:46.226 Test: blockdev write zeroes read split ...passed 00:08:46.226 Test: blockdev write zeroes read split partial ...passed 00:08:46.226 Test: blockdev reset ...passed 00:08:46.226 Test: blockdev write read 8 blocks ...passed 00:08:46.226 Test: blockdev write read size > 128k ...passed 00:08:46.226 Test: blockdev write read invalid size ...passed 00:08:46.226 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:46.226 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:46.226 Test: blockdev write read max offset ...passed 00:08:46.226 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:46.226 Test: blockdev writev readv 8 blocks ...passed 00:08:46.226 
Test: blockdev writev readv 30 x 1block ...passed 00:08:46.226 Test: blockdev writev readv block ...passed 00:08:46.226 Test: blockdev writev readv size > 128k ...passed 00:08:46.226 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:46.226 Test: blockdev comparev and writev ...passed 00:08:46.226 Test: blockdev nvme passthru rw ...passed 00:08:46.226 Test: blockdev nvme passthru vendor specific ...passed 00:08:46.226 Test: blockdev nvme admin passthru ...passed 00:08:46.226 Test: blockdev copy ...passed 00:08:46.226 Suite: bdevio tests on: Malloc2p0 00:08:46.226 Test: blockdev write read block ...passed 00:08:46.226 Test: blockdev write zeroes read block ...passed 00:08:46.226 Test: blockdev write zeroes read no split ...passed 00:08:46.226 Test: blockdev write zeroes read split ...passed 00:08:46.226 Test: blockdev write zeroes read split partial ...passed 00:08:46.226 Test: blockdev reset ...passed 00:08:46.226 Test: blockdev write read 8 blocks ...passed 00:08:46.226 Test: blockdev write read size > 128k ...passed 00:08:46.226 Test: blockdev write read invalid size ...passed 00:08:46.226 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:46.226 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:46.226 Test: blockdev write read max offset ...passed 00:08:46.226 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:46.226 Test: blockdev writev readv 8 blocks ...passed 00:08:46.226 Test: blockdev writev readv 30 x 1block ...passed 00:08:46.226 Test: blockdev writev readv block ...passed 00:08:46.226 Test: blockdev writev readv size > 128k ...passed 00:08:46.226 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:46.226 Test: blockdev comparev and writev ...passed 00:08:46.226 Test: blockdev nvme passthru rw ...passed 00:08:46.226 Test: blockdev nvme passthru vendor specific ...passed 00:08:46.226 Test: blockdev nvme admin passthru ...passed 00:08:46.226 Test: blockdev copy ...passed 00:08:46.226 Suite: bdevio tests on: Malloc1p1 00:08:46.226 Test: blockdev write read block ...passed 00:08:46.226 Test: blockdev write zeroes read block ...passed 00:08:46.226 Test: blockdev write zeroes read no split ...passed 00:08:46.226 Test: blockdev write zeroes read split ...passed 00:08:46.226 Test: blockdev write zeroes read split partial ...passed 00:08:46.226 Test: blockdev reset ...passed 00:08:46.226 Test: blockdev write read 8 blocks ...passed 00:08:46.226 Test: blockdev write read size > 128k ...passed 00:08:46.226 Test: blockdev write read invalid size ...passed 00:08:46.226 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:46.226 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:46.226 Test: blockdev write read max offset ...passed 00:08:46.226 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:46.226 Test: blockdev writev readv 8 blocks ...passed 00:08:46.226 Test: blockdev writev readv 30 x 1block ...passed 00:08:46.226 Test: blockdev writev readv block ...passed 00:08:46.226 Test: blockdev writev readv size > 128k ...passed 00:08:46.226 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:46.226 Test: blockdev comparev and writev ...passed 00:08:46.226 Test: blockdev nvme passthru rw ...passed 00:08:46.226 Test: blockdev nvme passthru vendor specific ...passed 00:08:46.226 Test: blockdev nvme admin passthru ...passed 00:08:46.226 Test: blockdev copy ...passed 00:08:46.226 Suite: 
bdevio tests on: Malloc1p0 00:08:46.226 Test: blockdev write read block ...passed 00:08:46.226 Test: blockdev write zeroes read block ...passed 00:08:46.226 Test: blockdev write zeroes read no split ...passed 00:08:46.226 Test: blockdev write zeroes read split ...passed 00:08:46.227 Test: blockdev write zeroes read split partial ...passed 00:08:46.227 Test: blockdev reset ...passed 00:08:46.227 Test: blockdev write read 8 blocks ...passed 00:08:46.227 Test: blockdev write read size > 128k ...passed 00:08:46.227 Test: blockdev write read invalid size ...passed 00:08:46.227 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:46.227 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:46.227 Test: blockdev write read max offset ...passed 00:08:46.227 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:46.227 Test: blockdev writev readv 8 blocks ...passed 00:08:46.227 Test: blockdev writev readv 30 x 1block ...passed 00:08:46.227 Test: blockdev writev readv block ...passed 00:08:46.227 Test: blockdev writev readv size > 128k ...passed 00:08:46.227 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:46.227 Test: blockdev comparev and writev ...passed 00:08:46.227 Test: blockdev nvme passthru rw ...passed 00:08:46.227 Test: blockdev nvme passthru vendor specific ...passed 00:08:46.227 Test: blockdev nvme admin passthru ...passed 00:08:46.227 Test: blockdev copy ...passed 00:08:46.227 Suite: bdevio tests on: Malloc0 00:08:46.227 Test: blockdev write read block ...passed 00:08:46.227 Test: blockdev write zeroes read block ...passed 00:08:46.227 Test: blockdev write zeroes read no split ...passed 00:08:46.227 Test: blockdev write zeroes read split ...passed 00:08:46.227 Test: blockdev write zeroes read split partial ...passed 00:08:46.227 Test: blockdev reset ...passed 00:08:46.227 Test: blockdev write read 8 blocks ...passed 00:08:46.227 Test: blockdev write read size > 128k ...passed 00:08:46.227 Test: blockdev write read invalid size ...passed 00:08:46.227 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:46.227 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:46.227 Test: blockdev write read max offset ...passed 00:08:46.227 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:46.227 Test: blockdev writev readv 8 blocks ...passed 00:08:46.227 Test: blockdev writev readv 30 x 1block ...passed 00:08:46.227 Test: blockdev writev readv block ...passed 00:08:46.227 Test: blockdev writev readv size > 128k ...passed 00:08:46.227 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:46.227 Test: blockdev comparev and writev ...passed 00:08:46.227 Test: blockdev nvme passthru rw ...passed 00:08:46.227 Test: blockdev nvme passthru vendor specific ...passed 00:08:46.227 Test: blockdev nvme admin passthru ...passed 00:08:46.227 Test: blockdev copy ...passed 00:08:46.227 00:08:46.227 Run Summary: Type Total Ran Passed Failed Inactive 00:08:46.227 suites 16 16 n/a 0 0 00:08:46.227 tests 368 368 368 0 0 00:08:46.227 asserts 2224 2224 2224 0 n/a 00:08:46.227 00:08:46.227 Elapsed time = 0.506 seconds 00:08:46.227 0 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 65570 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 65570 ']' 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 65570 
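With the run summary above printed, the harness tears the bdevio process down (the killprocess trace around this point). The next test, bdev_nbd, set up below, exercises the same bdev list by exporting it over Linux NBD devices via the /var/tmp/spdk-nbd.sock RPC socket. A minimal sketch of the underlying RPC flow; the bdev name, /dev/nbd0 and the dd workload are chosen purely as illustration:

    # Map a bdev onto a kernel NBD device, do some raw I/O, then unmap it
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
    dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=16 oflag=direct
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0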
00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 65570 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 65570' 00:08:46.227 killing process with pid 65570 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 65570 00:08:46.227 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 65570 00:08:46.485 09:12:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:46.485 00:08:46.485 real 0m1.735s 00:08:46.485 user 0m4.316s 00:08:46.485 sys 0m0.515s 00:08:46.485 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:46.485 09:12:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:46.485 ************************************ 00:08:46.485 END TEST bdev_bounds 00:08:46.485 ************************************ 00:08:46.485 09:12:55 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:46.485 09:12:55 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:46.485 09:12:55 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:46.485 09:12:55 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:46.485 09:12:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:46.744 ************************************ 00:08:46.744 START TEST bdev_nbd 00:08:46.744 ************************************ 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=65786 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 65786 /var/tmp/spdk-nbd.sock 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 65786 ']' 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:46.744 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:46.744 09:12:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:46.744 [2024-07-15 09:12:55.542812] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:08:46.744 [2024-07-15 09:12:55.542882] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:46.744 [2024-07-15 09:12:55.674389] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.003 [2024-07-15 09:12:55.776711] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.003 [2024-07-15 09:12:55.934276] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:47.003 [2024-07-15 09:12:55.934343] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:47.003 [2024-07-15 09:12:55.934360] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:47.003 [2024-07-15 09:12:55.942280] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:47.003 [2024-07-15 09:12:55.942310] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:47.003 [2024-07-15 09:12:55.950290] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:47.003 [2024-07-15 09:12:55.950316] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:47.261 [2024-07-15 09:12:56.027162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:47.261 [2024-07-15 09:12:56.027215] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:47.261 [2024-07-15 09:12:56.027233] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1192a40 00:08:47.261 [2024-07-15 09:12:56.027246] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:47.261 [2024-07-15 09:12:56.028709] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:47.261 [2024-07-15 09:12:56.028748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:47.519 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:47.777 1+0 records in 00:08:47.777 1+0 records out 00:08:47.777 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023426 s, 17.5 MB/s 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:47.777 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:48.035 09:12:56 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:48.035 1+0 records in 00:08:48.035 1+0 records out 00:08:48.035 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027295 s, 15.0 MB/s 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:48.035 09:12:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:48.293 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:48.550 1+0 records in 00:08:48.550 1+0 records out 00:08:48.550 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000340566 s, 12.0 MB/s 00:08:48.550 
09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:48.550 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:48.550 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:48.550 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:48.550 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:48.550 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:48.550 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:48.550 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:48.807 1+0 records in 00:08:48.807 1+0 records out 00:08:48.807 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311928 s, 13.1 MB/s 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:48.807 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:49.063 1+0 records in 00:08:49.063 1+0 records out 00:08:49.063 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337955 s, 12.1 MB/s 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:49.063 09:12:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:08:49.320 1+0 records in 00:08:49.320 1+0 records out 00:08:49.320 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000416859 s, 9.8 MB/s 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:49.320 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:49.577 1+0 records in 00:08:49.577 1+0 records out 00:08:49.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000451854 s, 9.1 MB/s 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:49.577 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 
00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:49.835 1+0 records in 00:08:49.835 1+0 records out 00:08:49.835 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000408488 s, 10.0 MB/s 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:49.835 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.092 09:12:58 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:50.092 1+0 records in 00:08:50.092 1+0 records out 00:08:50.092 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000563329 s, 7.3 MB/s 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:50.092 09:12:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:50.352 1+0 records in 00:08:50.352 1+0 records out 00:08:50.352 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000473104 s, 8.7 MB/s 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:50.352 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:50.352 09:12:59 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:50.672 1+0 records in 00:08:50.672 1+0 records out 00:08:50.672 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000463489 s, 8.8 MB/s 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:50.672 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:50.929 1+0 records in 00:08:50.929 1+0 records out 00:08:50.929 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000555952 s, 7.4 MB/s 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.929 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:50.929 1+0 records in 00:08:50.929 1+0 records out 00:08:50.929 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000528704 s, 7.7 MB/s 00:08:51.185 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.185 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.185 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.185 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.185 09:12:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.185 09:12:59 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:51.185 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:51.185 09:12:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:51.441 1+0 records in 00:08:51.441 1+0 records out 00:08:51.441 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000770959 s, 5.3 MB/s 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w 
nbd14 /proc/partitions 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.441 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:51.441 1+0 records in 00:08:51.442 1+0 records out 00:08:51.442 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000828097 s, 4.9 MB/s 00:08:51.442 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.698 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.698 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.698 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.698 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.698 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:51.698 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:51.698 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:51.956 1+0 records in 00:08:51.956 1+0 records out 00:08:51.956 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000725771 s, 5.6 MB/s 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:51.956 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:51.956 { 00:08:51.956 "nbd_device": "/dev/nbd0", 00:08:51.956 "bdev_name": "Malloc0" 00:08:51.956 }, 00:08:51.956 { 00:08:51.956 "nbd_device": "/dev/nbd1", 00:08:51.956 "bdev_name": "Malloc1p0" 00:08:51.956 }, 00:08:51.956 { 00:08:51.956 "nbd_device": "/dev/nbd2", 00:08:51.957 "bdev_name": "Malloc1p1" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd3", 00:08:51.957 "bdev_name": "Malloc2p0" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd4", 00:08:51.957 "bdev_name": "Malloc2p1" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd5", 00:08:51.957 "bdev_name": "Malloc2p2" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd6", 00:08:51.957 "bdev_name": "Malloc2p3" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd7", 00:08:51.957 "bdev_name": "Malloc2p4" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd8", 00:08:51.957 "bdev_name": "Malloc2p5" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd9", 00:08:51.957 "bdev_name": "Malloc2p6" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd10", 00:08:51.957 "bdev_name": "Malloc2p7" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd11", 00:08:51.957 "bdev_name": "TestPT" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd12", 00:08:51.957 "bdev_name": "raid0" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd13", 00:08:51.957 "bdev_name": "concat0" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd14", 00:08:51.957 "bdev_name": "raid1" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd15", 00:08:51.957 "bdev_name": "AIO0" 00:08:51.957 } 00:08:51.957 ]' 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd0", 00:08:51.957 "bdev_name": "Malloc0" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd1", 00:08:51.957 "bdev_name": "Malloc1p0" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd2", 00:08:51.957 "bdev_name": "Malloc1p1" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd3", 00:08:51.957 "bdev_name": "Malloc2p0" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd4", 00:08:51.957 "bdev_name": "Malloc2p1" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd5", 00:08:51.957 "bdev_name": "Malloc2p2" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd6", 00:08:51.957 "bdev_name": "Malloc2p3" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd7", 00:08:51.957 "bdev_name": "Malloc2p4" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd8", 00:08:51.957 "bdev_name": "Malloc2p5" 
00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd9", 00:08:51.957 "bdev_name": "Malloc2p6" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd10", 00:08:51.957 "bdev_name": "Malloc2p7" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd11", 00:08:51.957 "bdev_name": "TestPT" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd12", 00:08:51.957 "bdev_name": "raid0" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd13", 00:08:51.957 "bdev_name": "concat0" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd14", 00:08:51.957 "bdev_name": "raid1" 00:08:51.957 }, 00:08:51.957 { 00:08:51.957 "nbd_device": "/dev/nbd15", 00:08:51.957 "bdev_name": "AIO0" 00:08:51.957 } 00:08:51.957 ]' 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.957 09:13:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:52.215 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:52.215 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:52.472 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:52.472 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:52.472 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:52.472 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:52.472 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:52.472 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:52.472 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:52.472 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:52.730 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:52.988 09:13:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 
-- # for i in "${nbd_list[@]}" 00:08:53.245 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:53.503 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:53.761 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.019 09:13:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd8 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.277 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.535 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.794 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:55.052 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:55.053 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:55.053 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:55.053 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:55.053 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:55.053 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd11 /proc/partitions 00:08:55.053 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:55.053 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:55.053 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:55.053 09:13:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:55.312 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:55.570 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:55.830 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:56.088 09:13:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:56.347 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:56.605 /dev/nbd0 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:56.605 1+0 records in 00:08:56.605 1+0 records out 00:08:56.605 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261147 s, 15.7 MB/s 00:08:56.605 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.606 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:56.606 09:13:05 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.606 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:56.606 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:56.606 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:56.606 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:56.606 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:56.894 /dev/nbd1 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:56.894 1+0 records in 00:08:56.894 1+0 records out 00:08:56.894 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291244 s, 14.1 MB/s 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:56.894 09:13:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:57.153 /dev/nbd10 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 
)) 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:57.153 1+0 records in 00:08:57.153 1+0 records out 00:08:57.153 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286497 s, 14.3 MB/s 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:57.153 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:57.411 /dev/nbd11 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:57.411 1+0 records in 00:08:57.411 1+0 records out 00:08:57.411 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000341024 s, 12.0 MB/s 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:57.411 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:08:57.670 /dev/nbd12 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:57.670 1+0 records in 00:08:57.670 1+0 records out 00:08:57.670 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00039032 s, 10.5 MB/s 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:57.670 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:08:57.928 /dev/nbd13 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:57.928 09:13:06 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:57.928 1+0 records in 00:08:57.928 1+0 records out 00:08:57.928 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00043763 s, 9.4 MB/s 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:57.928 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:08:58.186 /dev/nbd14 00:08:58.186 09:13:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:58.186 1+0 records in 00:08:58.186 1+0 records out 00:08:58.186 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000463938 s, 8.8 MB/s 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:58.186 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:08:58.444 /dev/nbd15 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:58.444 1+0 records in 00:08:58.444 1+0 records out 00:08:58.444 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000500888 s, 8.2 MB/s 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:58.444 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:08:58.702 /dev/nbd2 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:58.702 09:13:07 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:58.702 1+0 records in 00:08:58.702 1+0 records out 00:08:58.702 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00050064 s, 8.2 MB/s 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:58.702 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:08:58.959 /dev/nbd3 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:58.959 1+0 records in 00:08:58.959 1+0 records out 00:08:58.959 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000480328 s, 8.5 MB/s 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:58.959 09:13:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:08:59.216 /dev/nbd4 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:59.216 1+0 records in 00:08:59.216 1+0 records out 00:08:59.216 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00057608 s, 7.1 MB/s 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:59.216 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:08:59.473 /dev/nbd5 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:59.473 09:13:08 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:59.473 1+0 records in 00:08:59.473 1+0 records out 00:08:59.473 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00139827 s, 2.9 MB/s 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:59.473 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:09:00.037 /dev/nbd6 00:09:00.037 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:09:00.037 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:09:00.037 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:00.037 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:00.037 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:00.037 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:00.037 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:00.038 1+0 records in 00:09:00.038 1+0 records out 00:09:00.038 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000560831 s, 7.3 MB/s 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
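
The trace above and below this point follows one fixed per-device pattern: nbd_start_disk exports a bdev over NBD through the RPC socket /var/tmp/spdk-nbd.sock, the test then polls /proc/partitions (up to 20 attempts) until the kernel device appears, reads a single 4 KiB block with iflag=direct to confirm the device answers I/O, later fills it from /dev/urandom and compares the data back with cmp, and finally detaches it with nbd_stop_disk and waits for the /proc/partitions entry to disappear. The sketch below re-creates that cycle as a standalone script. It is only an illustration, not the actual nbd_common.sh / autotest_common.sh helpers: the function names, the 0.1 s sleep between polls, the scratch-file paths and the minimal error handling are assumptions, while the rpc.py subcommands, socket path, dd parameters and jq filter are the ones visible in the trace.

#!/usr/bin/env bash
# Illustrative re-creation of the NBD start/verify/stop cycle traced in this log.
# Helper names, sleep values and scratch paths are hypothetical; the RPC
# subcommands, socket path, dd parameters and jq filter come from the trace.
set -e

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
scratch=/tmp/nbdtest          # assumed location; the traced test uses spdk/test/bdev/nbdtest
rand=/tmp/nbdrandtest         # assumed location; the traced test uses spdk/test/bdev/nbdrandtest

wait_for_nbd() {              # wait until e.g. "nbd0" shows up in /proc/partitions
    local name=$1 i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$name" /proc/partitions && return 0
        sleep 0.1             # assumed back-off; the log only shows the retry counter
    done
    return 1
}

wait_for_nbd_exit() {         # wait until the device is gone from /proc/partitions
    local name=$1 i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$name" /proc/partitions || return 0
        sleep 0.1
    done
    return 1
}

start_and_check() {           # export a bdev over NBD and do one direct 4 KiB read
    local bdev=$1 dev=$2
    $rpc nbd_start_disk "$bdev" "$dev"
    wait_for_nbd "$(basename "$dev")"
    dd if="$dev" of="$scratch" bs=4096 count=1 iflag=direct
    [ "$(stat -c %s "$scratch")" != 0 ]    # the read must have returned data
    rm -f "$scratch"
}

write_and_verify() {          # write 1 MiB of random data and compare it back
    local dev=$1
    dd if=/dev/urandom of="$rand" bs=4096 count=256
    dd if="$rand" of="$dev" bs=4096 count=256 oflag=direct
    cmp -b -n 1M "$rand" "$dev"
}

stop_and_wait() {             # detach the NBD device and wait for it to vanish
    local dev=$1
    $rpc nbd_stop_disk "$dev"
    wait_for_nbd_exit "$(basename "$dev")"
}

# One round trip for a single bdev/device pair taken from the trace:
start_and_check Malloc0 /dev/nbd0
$rpc nbd_get_disks | jq -r '.[] | .nbd_device'   # should now list /dev/nbd0
write_and_verify /dev/nbd0
stop_and_wait /dev/nbd0
rm -f "$rand"

Polling /proc/partitions instead of trusting the RPC return value matters because the kernel sets up the NBD device node asynchronously after nbd_start_disk returns, which is presumably why the traced helpers retry up to 20 times before declaring the device present or gone.
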
00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:00.038 09:13:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:09:00.295 /dev/nbd7 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:00.295 1+0 records in 00:09:00.295 1+0 records out 00:09:00.295 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000586258 s, 7.0 MB/s 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:00.295 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:09:00.553 /dev/nbd8 00:09:00.553 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:09:00.553 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:09:00.553 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:00.553 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:00.554 
09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:00.554 1+0 records in 00:09:00.554 1+0 records out 00:09:00.554 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000539835 s, 7.6 MB/s 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:00.554 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:09:00.812 /dev/nbd9 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:00.812 1+0 records in 00:09:00.812 1+0 records out 00:09:00.812 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000454517 s, 9.0 MB/s 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:00.812 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:00.813 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:00.813 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:01.071 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd0", 00:09:01.071 "bdev_name": "Malloc0" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd1", 00:09:01.071 "bdev_name": "Malloc1p0" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd10", 00:09:01.071 "bdev_name": "Malloc1p1" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd11", 00:09:01.071 "bdev_name": "Malloc2p0" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd12", 00:09:01.071 "bdev_name": "Malloc2p1" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd13", 00:09:01.071 "bdev_name": "Malloc2p2" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd14", 00:09:01.071 "bdev_name": "Malloc2p3" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd15", 00:09:01.071 "bdev_name": "Malloc2p4" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd2", 00:09:01.071 "bdev_name": "Malloc2p5" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd3", 00:09:01.071 "bdev_name": "Malloc2p6" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd4", 00:09:01.071 "bdev_name": "Malloc2p7" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd5", 00:09:01.071 "bdev_name": "TestPT" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd6", 00:09:01.071 "bdev_name": "raid0" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd7", 00:09:01.071 "bdev_name": "concat0" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd8", 00:09:01.071 "bdev_name": "raid1" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd9", 00:09:01.071 "bdev_name": "AIO0" 00:09:01.071 } 00:09:01.071 ]' 00:09:01.071 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd0", 00:09:01.071 "bdev_name": "Malloc0" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd1", 00:09:01.071 "bdev_name": "Malloc1p0" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd10", 00:09:01.071 "bdev_name": "Malloc1p1" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd11", 00:09:01.071 "bdev_name": "Malloc2p0" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd12", 00:09:01.071 "bdev_name": "Malloc2p1" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd13", 00:09:01.071 "bdev_name": "Malloc2p2" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd14", 00:09:01.071 "bdev_name": "Malloc2p3" 00:09:01.071 }, 00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd15", 00:09:01.071 "bdev_name": "Malloc2p4" 00:09:01.071 }, 
00:09:01.071 { 00:09:01.071 "nbd_device": "/dev/nbd2", 00:09:01.071 "bdev_name": "Malloc2p5" 00:09:01.072 }, 00:09:01.072 { 00:09:01.072 "nbd_device": "/dev/nbd3", 00:09:01.072 "bdev_name": "Malloc2p6" 00:09:01.072 }, 00:09:01.072 { 00:09:01.072 "nbd_device": "/dev/nbd4", 00:09:01.072 "bdev_name": "Malloc2p7" 00:09:01.072 }, 00:09:01.072 { 00:09:01.072 "nbd_device": "/dev/nbd5", 00:09:01.072 "bdev_name": "TestPT" 00:09:01.072 }, 00:09:01.072 { 00:09:01.072 "nbd_device": "/dev/nbd6", 00:09:01.072 "bdev_name": "raid0" 00:09:01.072 }, 00:09:01.072 { 00:09:01.072 "nbd_device": "/dev/nbd7", 00:09:01.072 "bdev_name": "concat0" 00:09:01.072 }, 00:09:01.072 { 00:09:01.072 "nbd_device": "/dev/nbd8", 00:09:01.072 "bdev_name": "raid1" 00:09:01.072 }, 00:09:01.072 { 00:09:01.072 "nbd_device": "/dev/nbd9", 00:09:01.072 "bdev_name": "AIO0" 00:09:01.072 } 00:09:01.072 ]' 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:01.072 /dev/nbd1 00:09:01.072 /dev/nbd10 00:09:01.072 /dev/nbd11 00:09:01.072 /dev/nbd12 00:09:01.072 /dev/nbd13 00:09:01.072 /dev/nbd14 00:09:01.072 /dev/nbd15 00:09:01.072 /dev/nbd2 00:09:01.072 /dev/nbd3 00:09:01.072 /dev/nbd4 00:09:01.072 /dev/nbd5 00:09:01.072 /dev/nbd6 00:09:01.072 /dev/nbd7 00:09:01.072 /dev/nbd8 00:09:01.072 /dev/nbd9' 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:01.072 /dev/nbd1 00:09:01.072 /dev/nbd10 00:09:01.072 /dev/nbd11 00:09:01.072 /dev/nbd12 00:09:01.072 /dev/nbd13 00:09:01.072 /dev/nbd14 00:09:01.072 /dev/nbd15 00:09:01.072 /dev/nbd2 00:09:01.072 /dev/nbd3 00:09:01.072 /dev/nbd4 00:09:01.072 /dev/nbd5 00:09:01.072 /dev/nbd6 00:09:01.072 /dev/nbd7 00:09:01.072 /dev/nbd8 00:09:01.072 /dev/nbd9' 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:09:01.072 256+0 
records in 00:09:01.072 256+0 records out 00:09:01.072 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011496 s, 91.2 MB/s 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:01.072 09:13:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:01.330 256+0 records in 00:09:01.330 256+0 records out 00:09:01.330 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185511 s, 5.7 MB/s 00:09:01.330 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:01.330 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:01.589 256+0 records in 00:09:01.589 256+0 records out 00:09:01.589 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185373 s, 5.7 MB/s 00:09:01.589 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:01.589 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:09:01.589 256+0 records in 00:09:01.589 256+0 records out 00:09:01.589 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185469 s, 5.7 MB/s 00:09:01.589 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:01.589 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:09:01.847 256+0 records in 00:09:01.847 256+0 records out 00:09:01.847 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185399 s, 5.7 MB/s 00:09:01.847 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:01.847 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:09:02.105 256+0 records in 00:09:02.105 256+0 records out 00:09:02.105 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.170699 s, 6.1 MB/s 00:09:02.105 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:02.105 09:13:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:09:02.363 256+0 records in 00:09:02.364 256+0 records out 00:09:02.364 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.165598 s, 6.3 MB/s 00:09:02.364 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:02.364 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:09:02.364 256+0 records in 00:09:02.364 256+0 records out 00:09:02.364 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184531 s, 5.7 MB/s 00:09:02.364 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:02.364 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:09:02.622 256+0 records in 00:09:02.622 256+0 records out 
00:09:02.622 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184265 s, 5.7 MB/s 00:09:02.622 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:02.622 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:09:02.880 256+0 records in 00:09:02.880 256+0 records out 00:09:02.880 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18501 s, 5.7 MB/s 00:09:02.880 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:02.880 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:09:02.880 256+0 records in 00:09:02.880 256+0 records out 00:09:02.880 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183176 s, 5.7 MB/s 00:09:02.880 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:02.880 09:13:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:09:03.139 256+0 records in 00:09:03.139 256+0 records out 00:09:03.139 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184697 s, 5.7 MB/s 00:09:03.139 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:03.139 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:09:03.396 256+0 records in 00:09:03.396 256+0 records out 00:09:03.396 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.179349 s, 5.8 MB/s 00:09:03.396 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:03.396 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:09:03.396 256+0 records in 00:09:03.396 256+0 records out 00:09:03.396 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.124549 s, 8.4 MB/s 00:09:03.396 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:03.396 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:09:03.654 256+0 records in 00:09:03.654 256+0 records out 00:09:03.654 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181109 s, 5.8 MB/s 00:09:03.654 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:03.654 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:09:03.912 256+0 records in 00:09:03.912 256+0 records out 00:09:03.912 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.188622 s, 5.6 MB/s 00:09:03.912 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:03.912 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:09:04.173 256+0 records in 00:09:04.173 256+0 records out 00:09:04.173 1048576 bytes (1.0 MB, 1.0 MiB) copied, 
0.183796 s, 5.7 MB/s 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:09:04.173 09:13:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:04.173 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:04.453 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:04.453 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:04.453 09:13:13 
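The write/verify pass traced above reduces to a small pattern; a condensed sketch, reconstructed from the commands visible in this trace (device order is the one the harness passed in, and variable names follow nbd_common.sh):

tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9)
# write phase: fill a 1 MiB scratch file with random data, then copy it onto every NBD device
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
for i in "${nbd_list[@]}"; do
  dd if="$tmp_file" of="$i" bs=4096 count=256 oflag=direct
done
# verify phase: every device must match the first 1 MiB of the scratch file byte for byte
for i in "${nbd_list[@]}"; do
  cmp -b -n 1M "$tmp_file" "$i"
done
rm "$tmp_file"

oflag=direct keeps the writes out of the page cache, so the later cmp actually reads the data back through the NBD/bdev path rather than from cached pages.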
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:04.453 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:04.453 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:04.453 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:04.453 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:04.453 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:04.453 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:04.453 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:04.711 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:04.971 09:13:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:05.229 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:05.229 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:05.229 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:05.229 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:05.229 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:05.229 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:05.229 
09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:05.229 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:05.229 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:05.229 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:05.487 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:05.745 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:06.003 09:13:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:06.261 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:06.519 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:06.519 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:06.519 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:06.519 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:06.519 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:06.519 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:06.778 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:06.778 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:06.778 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:06.778 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:07.037 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:07.297 09:13:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:07.297 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:07.297 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # 
local nbd_name=nbd4 00:09:07.297 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:07.297 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:07.297 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:07.297 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:07.297 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:07.297 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:07.297 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:07.556 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:07.815 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:08.074 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:08.074 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:08.074 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:08.074 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:08.074 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:08.074 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # 
break 00:09:08.074 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:08.074 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:08.074 09:13:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:08.333 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:08.592 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:09:08.851 09:13:17 
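Each of the sixteen teardown iterations just traced repeats one stop-and-poll pattern; a minimal sketch of it follows (the 0.1 s retry delay is an assumption, since every device in this run had already vanished on the first check):

for i in "${nbd_list[@]}"; do
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk "$i"
  nbd_name=$(basename "$i")
  # wait for the kernel node to drop out of /proc/partitions (waitfornbd_exit in nbd_common.sh)
  for ((j = 1; j <= 20; j++)); do
    grep -q -w "$nbd_name" /proc/partitions || break
    sleep 0.1   # assumed back-off; not exercised in this run
  done
done

The poll guards against the /dev/nbdX node lingering for a moment after the nbd_stop_disk RPC has already returned.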
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:09:08.851 malloc_lvol_verify 00:09:08.851 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:09:09.110 2415c4c2-2271-4495-ac6f-6f5a669a2ee6 00:09:09.110 09:13:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:09:09.369 0c72540c-fc94-437d-9477-59466076c21e 00:09:09.369 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:09:09.629 /dev/nbd0 00:09:09.629 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:09:09.629 mke2fs 1.46.5 (30-Dec-2021) 00:09:09.629 Discarding device blocks: 0/4096 done 00:09:09.629 Creating filesystem with 4096 1k blocks and 1024 inodes 00:09:09.629 00:09:09.629 Allocating group tables: 0/1 done 00:09:09.629 Writing inode tables: 0/1 done 00:09:09.629 Creating journal (1024 blocks): done 00:09:09.629 Writing superblocks and filesystem accounting information: 0/1 done 00:09:09.629 00:09:09.629 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:09:09.629 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:09:09.629 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:09.629 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:09.629 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:09.629 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:09.629 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:09.629 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 
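The lvol round trip just set up condenses to the following sequence (names and sizes exactly as in the log; the nbd0 teardown it triggers continues in the trace below):

rpc='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock'
$rpc bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MB malloc bdev, 512-byte blocks
$rpc bdev_lvol_create_lvstore malloc_lvol_verify lvs   # prints the lvstore UUID seen above
$rpc bdev_lvol_create lvol 4 -l lvs                    # small lvol (size argument 4, as logged)
$rpc nbd_start_disk lvs/lvol /dev/nbd0
mkfs.ext4 /dev/nbd0                                    # mkfs_ret=0 is the pass criterion here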
00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 65786 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 65786 ']' 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 65786 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 65786 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 65786' 00:09:09.888 killing process with pid 65786 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 65786 00:09:09.888 09:13:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 65786 00:09:10.146 09:13:19 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:09:10.146 00:09:10.146 real 0m23.609s 00:09:10.146 user 0m28.657s 00:09:10.146 sys 0m14.033s 00:09:10.146 09:13:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:10.146 09:13:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:10.146 ************************************ 00:09:10.146 END TEST bdev_nbd 00:09:10.146 ************************************ 00:09:10.404 09:13:19 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:10.404 09:13:19 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:09:10.404 09:13:19 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:09:10.404 09:13:19 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:09:10.404 09:13:19 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:09:10.404 09:13:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:10.404 09:13:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:10.404 09:13:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:10.404 ************************************ 00:09:10.404 START TEST bdev_fio 00:09:10.404 ************************************ 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:09:10.404 09:13:19 
blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:10.404 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:10.404 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 
blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:09:10.405 09:13:19 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:10.405 09:13:19 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:10.405 ************************************ 00:09:10.405 START TEST bdev_fio_rw_verify 00:09:10.405 ************************************ 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:10.405 09:13:19 
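The per-bdev loop traced above emits one two-line job section per bdev; a sketch of its shape (the redirect target is assumed to be the bdev.fio file generated a moment earlier, and $config_file is an illustrative name):

for b in "${bdevs_name[@]}"; do   # Malloc0, Malloc1p0, Malloc1p1, Malloc2p0..Malloc2p7, TestPT, raid0, concat0, raid1, AIO0
  echo "[job_$b]"
  echo "filename=$b"
done >> "$config_file"

So fio ends up with sixteen [job_*] sections, one per bdev, on top of the common parameters passed on the command line.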
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:10.405 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:10.668 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:10.668 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:10.668 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:10.668 09:13:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:10.924 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 
job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:10.924 fio-3.35 00:09:10.924 Starting 16 threads 00:09:23.152 00:09:23.152 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=69698: Mon Jul 15 09:13:31 2024 00:09:23.152 read: IOPS=88.0k, BW=344MiB/s (361MB/s)(3439MiB/10001msec) 00:09:23.152 slat (usec): min=2, max=4176, avg=36.35, stdev=15.02 00:09:23.152 clat (usec): min=12, max=4625, avg=298.92, stdev=132.67 00:09:23.152 lat (usec): min=21, max=4666, avg=335.27, stdev=140.29 00:09:23.152 clat percentiles (usec): 00:09:23.152 | 50.000th=[ 293], 99.000th=[ 586], 99.900th=[ 766], 99.990th=[ 947], 00:09:23.152 | 99.999th=[ 1401] 00:09:23.152 write: IOPS=137k, BW=537MiB/s (563MB/s)(5294MiB/9857msec); 0 zone resets 00:09:23.152 slat (usec): min=8, max=437, avg=50.17, stdev=15.24 00:09:23.152 clat (usec): min=14, max=1897, avg=355.21, stdev=159.12 00:09:23.152 lat (usec): min=43, max=2006, avg=405.38, stdev=166.86 00:09:23.152 clat percentiles (usec): 00:09:23.152 | 50.000th=[ 338], 99.000th=[ 799], 99.900th=[ 979], 99.990th=[ 1057], 00:09:23.152 | 99.999th=[ 1221] 00:09:23.152 bw ( KiB/s): min=465416, max=741971, per=98.80%, avg=543343.74, stdev=4336.54, samples=304 00:09:23.152 iops : min=116354, max=185492, avg=135836.00, stdev=1084.12, samples=304 00:09:23.152 lat (usec) : 20=0.01%, 50=0.34%, 100=3.50%, 250=29.76%, 500=51.83% 00:09:23.152 lat (usec) : 750=13.72%, 1000=0.81% 00:09:23.152 lat (msec) : 2=0.04%, 10=0.01% 00:09:23.152 cpu : usr=99.22%, sys=0.36%, ctx=656, majf=0, minf=2781 00:09:23.152 IO depths : 1=12.4%, 2=24.8%, 4=50.2%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:23.152 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:23.152 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:23.152 issued rwts: total=880468,1355140,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:23.152 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:23.152 00:09:23.152 Run status group 0 (all jobs): 00:09:23.153 READ: bw=344MiB/s (361MB/s), 344MiB/s-344MiB/s (361MB/s-361MB/s), io=3439MiB (3606MB), run=10001-10001msec 00:09:23.153 WRITE: bw=537MiB/s (563MB/s), 537MiB/s-537MiB/s (563MB/s-563MB/s), io=5294MiB (5551MB), run=9857-9857msec 00:09:23.153 00:09:23.153 real 0m12.277s 00:09:23.153 user 2m45.801s 00:09:23.153 sys 0m1.401s 00:09:23.153 09:13:31 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:23.153 09:13:31 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:23.153 ************************************ 00:09:23.153 END TEST bdev_fio_rw_verify 00:09:23.153 ************************************ 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- 
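Putting the pieces together, the verify pass that just finished was effectively this single invocation (every flag is taken from the run_test/fio_bdev lines above; LD_PRELOAD carries only the spdk_bdev fio plugin because both sanitizer-library probes came back empty):

LD_PRELOAD=" /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev" \
  /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio \
  --verify_state_save=0 \
  --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
  --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output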
bdev/blockdev.sh@350 -- # rm -f 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:23.153 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:23.154 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "9d434293-4838-4844-8a94-ff05f239b5cf"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9d434293-4838-4844-8a94-ff05f239b5cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "7f9948b0-3712-54e2-8746-4caed203ad9a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7f9948b0-3712-54e2-8746-4caed203ad9a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' 
},' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "e4a2d656-8e4d-59e6-a010-87f03c3fe05d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e4a2d656-8e4d-59e6-a010-87f03c3fe05d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "01f8811e-da90-514d-a368-ca6c62342984"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "01f8811e-da90-514d-a368-ca6c62342984",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "52b553cf-6865-5fec-a1bb-8577821c1d71"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "52b553cf-6865-5fec-a1bb-8577821c1d71",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": 
"Malloc2p2",' ' "aliases": [' ' "4dbadad4-929c-5842-8875-98548860095d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4dbadad4-929c-5842-8875-98548860095d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d230eb87-cbba-5020-8d68-b2df9505c49c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d230eb87-cbba-5020-8d68-b2df9505c49c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "3185c29a-e08b-5ef3-8361-c01c142053da"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3185c29a-e08b-5ef3-8361-c01c142053da",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "09d616e4-d884-5312-8f07-75409a4fa3cf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "09d616e4-d884-5312-8f07-75409a4fa3cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' 
"get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "38c52917-3a44-5583-8ea8-00269e3e9289"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "38c52917-3a44-5583-8ea8-00269e3e9289",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "67e2ba06-5292-590a-bfb2-77e5e39162ee"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "67e2ba06-5292-590a-bfb2-77e5e39162ee",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "e4f0ff21-2901-5825-8f0e-8204bbde143e"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e4f0ff21-2901-5825-8f0e-8204bbde143e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "33b9b487-3efc-4acc-8fdc-a2a9d02d50c6"' ' ],' ' "product_name": "Raid 
Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "33b9b487-3efc-4acc-8fdc-a2a9d02d50c6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "33b9b487-3efc-4acc-8fdc-a2a9d02d50c6",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "428cfc74-3bf1-4a4b-a3bc-2a2fed04e202",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "f8f8c968-bfe5-4a1e-a0c0-98869704e9b7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "f0efb1ce-4bd3-47a3-8863-1d70b87e5763"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f0efb1ce-4bd3-47a3-8863-1d70b87e5763",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f0efb1ce-4bd3-47a3-8863-1d70b87e5763",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "3fc94c3f-7d00-4841-845d-13d9683244d3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a0403874-7e7e-45ee-b974-b43b5f23ee26",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' 
"aliases": [' ' "879c81c3-edbc-4722-ad4c-6e584e2a6572"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "879c81c3-edbc-4722-ad4c-6e584e2a6572",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "879c81c3-edbc-4722-ad4c-6e584e2a6572",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "d381df3f-20bc-4158-b14b-5813c0a8c4c3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "2c15ca4b-eadd-4af1-9c5c-c55127866c92",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "ad2cf2c5-5ee8-4b9b-8db4-4d51fd0f6953"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "ad2cf2c5-5ee8-4b9b-8db4-4d51fd0f6953",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:23.154 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:23.154 Malloc1p0 00:09:23.154 Malloc1p1 00:09:23.154 Malloc2p0 00:09:23.154 Malloc2p1 00:09:23.154 Malloc2p2 00:09:23.154 Malloc2p3 00:09:23.154 Malloc2p4 00:09:23.154 Malloc2p5 00:09:23.154 Malloc2p6 00:09:23.154 Malloc2p7 00:09:23.154 TestPT 00:09:23.154 raid0 00:09:23.154 concat0 ]] 00:09:23.154 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' 
"9d434293-4838-4844-8a94-ff05f239b5cf"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9d434293-4838-4844-8a94-ff05f239b5cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "7f9948b0-3712-54e2-8746-4caed203ad9a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7f9948b0-3712-54e2-8746-4caed203ad9a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "e4a2d656-8e4d-59e6-a010-87f03c3fe05d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e4a2d656-8e4d-59e6-a010-87f03c3fe05d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "01f8811e-da90-514d-a368-ca6c62342984"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "01f8811e-da90-514d-a368-ca6c62342984",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": 
false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "52b553cf-6865-5fec-a1bb-8577821c1d71"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "52b553cf-6865-5fec-a1bb-8577821c1d71",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "4dbadad4-929c-5842-8875-98548860095d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4dbadad4-929c-5842-8875-98548860095d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d230eb87-cbba-5020-8d68-b2df9505c49c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d230eb87-cbba-5020-8d68-b2df9505c49c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "3185c29a-e08b-5ef3-8361-c01c142053da"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3185c29a-e08b-5ef3-8361-c01c142053da",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "09d616e4-d884-5312-8f07-75409a4fa3cf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "09d616e4-d884-5312-8f07-75409a4fa3cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "38c52917-3a44-5583-8ea8-00269e3e9289"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "38c52917-3a44-5583-8ea8-00269e3e9289",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "67e2ba06-5292-590a-bfb2-77e5e39162ee"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "67e2ba06-5292-590a-bfb2-77e5e39162ee",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' 
"nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "e4f0ff21-2901-5825-8f0e-8204bbde143e"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e4f0ff21-2901-5825-8f0e-8204bbde143e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "33b9b487-3efc-4acc-8fdc-a2a9d02d50c6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "33b9b487-3efc-4acc-8fdc-a2a9d02d50c6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "33b9b487-3efc-4acc-8fdc-a2a9d02d50c6",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "428cfc74-3bf1-4a4b-a3bc-2a2fed04e202",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "f8f8c968-bfe5-4a1e-a0c0-98869704e9b7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "f0efb1ce-4bd3-47a3-8863-1d70b87e5763"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f0efb1ce-4bd3-47a3-8863-1d70b87e5763",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": 
true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f0efb1ce-4bd3-47a3-8863-1d70b87e5763",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "3fc94c3f-7d00-4841-845d-13d9683244d3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a0403874-7e7e-45ee-b974-b43b5f23ee26",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "879c81c3-edbc-4722-ad4c-6e584e2a6572"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "879c81c3-edbc-4722-ad4c-6e584e2a6572",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "879c81c3-edbc-4722-ad4c-6e584e2a6572",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "d381df3f-20bc-4158-b14b-5813c0a8c4c3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "2c15ca4b-eadd-4af1-9c5c-c55127866c92",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "ad2cf2c5-5ee8-4b9b-8db4-4d51fd0f6953"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "ad2cf2c5-5ee8-4b9b-8db4-4d51fd0f6953",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": 
false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.155 09:13:31 
blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:23.155 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:23.156 09:13:31 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:23.156 ************************************ 00:09:23.156 START TEST bdev_fio_trim 00:09:23.156 ************************************ 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:23.156 09:13:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:23.413 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:23.413 fio-3.35 00:09:23.413 Starting 14 threads 00:09:35.604 00:09:35.604 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=71491: Mon Jul 15 09:13:42 2024 00:09:35.604 write: IOPS=125k, BW=490MiB/s (514MB/s)(4901MiB/10002msec); 0 zone resets 00:09:35.604 slat (usec): min=3, max=481, avg=39.42, stdev=10.38 00:09:35.604 clat (usec): min=30, max=3563, avg=281.09, stdev=92.49 00:09:35.604 lat (usec): min=40, max=3598, avg=320.50, stdev=96.09 00:09:35.604 clat percentiles (usec): 00:09:35.604 | 50.000th=[ 273], 99.000th=[ 482], 99.900th=[ 529], 99.990th=[ 594], 00:09:35.604 | 99.999th=[ 791] 00:09:35.604 bw ( KiB/s): min=461824, max=643992, per=100.00%, avg=502931.79, stdev=2950.11, samples=266 00:09:35.604 iops : min=115456, max=160997, avg=125733.00, stdev=737.52, samples=266 00:09:35.604 trim: IOPS=125k, BW=490MiB/s (514MB/s)(4901MiB/10002msec); 0 zone resets 00:09:35.604 slat (usec): min=4, max=135, avg=26.35, stdev= 6.82 00:09:35.604 clat (usec): min=4, max=3598, avg=315.86, stdev=101.69 00:09:35.604 lat (usec): min=15, max=3629, avg=342.21, stdev=104.59 00:09:35.604 clat percentiles (usec): 00:09:35.604 | 50.000th=[ 310], 99.000th=[ 529], 99.900th=[ 578], 99.990th=[ 644], 00:09:35.604 | 99.999th=[ 898] 00:09:35.604 bw ( KiB/s): min=461824, max=644000, per=100.00%, avg=502932.63, stdev=2950.28, samples=266 00:09:35.604 iops : min=115456, max=160999, avg=125733.11, stdev=737.56, samples=266 00:09:35.604 lat (usec) : 10=0.01%, 20=0.01%, 50=0.04%, 100=0.86%, 250=34.31% 00:09:35.604 lat (usec) : 500=63.26%, 750=1.53%, 1000=0.01% 00:09:35.604 lat (msec) : 2=0.01%, 4=0.01% 00:09:35.604 
cpu : usr=99.60%, sys=0.01%, ctx=530, majf=0, minf=991 00:09:35.604 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:35.604 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:35.604 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:35.604 issued rwts: total=0,1254532,1254535,0 short=0,0,0,0 dropped=0,0,0,0 00:09:35.604 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:35.604 00:09:35.604 Run status group 0 (all jobs): 00:09:35.604 WRITE: bw=490MiB/s (514MB/s), 490MiB/s-490MiB/s (514MB/s-514MB/s), io=4901MiB (5139MB), run=10002-10002msec 00:09:35.604 TRIM: bw=490MiB/s (514MB/s), 490MiB/s-490MiB/s (514MB/s-514MB/s), io=4901MiB (5139MB), run=10002-10002msec 00:09:35.604 00:09:35.604 real 0m11.406s 00:09:35.604 user 2m25.137s 00:09:35.604 sys 0m0.908s 00:09:35.604 09:13:43 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:35.604 09:13:43 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:35.604 ************************************ 00:09:35.604 END TEST bdev_fio_trim 00:09:35.604 ************************************ 00:09:35.604 09:13:43 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:35.604 09:13:43 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:35.604 09:13:43 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:35.604 09:13:43 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:35.604 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:35.604 09:13:43 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:35.604 00:09:35.604 real 0m24.054s 00:09:35.604 user 5m11.132s 00:09:35.604 sys 0m2.512s 00:09:35.604 09:13:43 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:35.604 09:13:43 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:35.604 ************************************ 00:09:35.604 END TEST bdev_fio 00:09:35.604 ************************************ 00:09:35.604 09:13:43 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:35.604 09:13:43 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:35.604 09:13:43 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:35.604 09:13:43 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:35.604 09:13:43 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:35.604 09:13:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:35.604 ************************************ 00:09:35.604 START TEST bdev_verify 00:09:35.604 ************************************ 00:09:35.604 09:13:43 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:35.604 [2024-07-15 09:13:43.362796] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
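For reference, the bdev_fio_trim pass that finishes above is driven entirely by the generated bdev.fio: fio_config_gen appends rw=trimwrite for the trim workload, and the loop over jq -r 'select(.supported_io_types.unmap == true) | .name' adds one [job_<bdev>] section per trim-capable bdev, which is why fio starts exactly 14 threads (raid1 and AIO0 are skipped because their unmap flag is false). A minimal sketch of that flow under the same paths, with the [global] template that fio_config_gen normally writes treated as an assumption because it is elided from this log:

fio_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
# The [global] defaults come from fio_config_gen's template and are not shown in
# the log above; only the section header is assumed here.
printf '[global]\n' > "$fio_file"
echo rw=trimwrite >> "$fio_file"                 # workload=trim -> trimwrite
# bdevs[] holds the per-bdev JSON objects dumped above, one object per element.
printf '%s\n' "${bdevs[@]}" |
    jq -r 'select(.supported_io_types.unmap == true) | .name' |
    while read -r b; do
        echo "[job_${b}]"    >> "$fio_file"
        echo "filename=${b}" >> "$fio_file"
    done
# Invocation as logged: the spdk_bdev ioengine comes from the LD_PRELOADed fio plugin.
LD_PRELOAD=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev \
    /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 "$fio_file" \
    --verify_state_save=0 \
    --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
    --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output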
00:09:35.604 [2024-07-15 09:13:43.362861] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72935 ] 00:09:35.604 [2024-07-15 09:13:43.490899] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:35.605 [2024-07-15 09:13:43.599345] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:35.605 [2024-07-15 09:13:43.599351] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:35.605 [2024-07-15 09:13:43.755387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:35.605 [2024-07-15 09:13:43.755451] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:35.605 [2024-07-15 09:13:43.755466] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:35.605 [2024-07-15 09:13:43.763394] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:35.605 [2024-07-15 09:13:43.763422] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:35.605 [2024-07-15 09:13:43.771407] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:35.605 [2024-07-15 09:13:43.771432] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:35.605 [2024-07-15 09:13:43.848459] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:35.605 [2024-07-15 09:13:43.848512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:35.605 [2024-07-15 09:13:43.848532] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e9e4d0 00:09:35.605 [2024-07-15 09:13:43.848545] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:35.605 [2024-07-15 09:13:43.850182] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:35.605 [2024-07-15 09:13:43.850213] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:35.605 Running I/O for 5 seconds... 
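The vbdev_passthru notices just above ("Match on Malloc3 ... created pt_bdev for: TestPT") are the runtime side of the passthru bdev described in the JSON dumps earlier (driver_specific.passthru.base_bdev_name = Malloc3); TestPT is only constructed once its base bdev is available. The bdev.json handed to bdevperf with --json is not reproduced in this log, so the following is only a hedged sketch of the kind of entries that produce this pairing; the method and parameter names are assumed from the standard SPDK bdev_malloc_create and bdev_passthru_create RPCs and may differ between SPDK versions:

# Hypothetical fragment, written out as a standalone file for illustration only;
# the real bdev.json is generated elsewhere in the test run. The Malloc3 geometry
# (65536 blocks x 512 B) is inferred from the TestPT description above.
cat <<'EOF' > bdev.json.sketch
{
  "subsystems": [ {
    "subsystem": "bdev",
    "config": [
      { "method": "bdev_malloc_create",   "params": { "name": "Malloc3", "num_blocks": 65536, "block_size": 512 } },
      { "method": "bdev_passthru_create", "params": { "base_bdev_name": "Malloc3", "name": "TestPT" } }
    ]
  } ]
}
EOF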
00:09:40.894 
00:09:40.894 Latency(us)
00:09:40.894 (verify jobs, IO size 4096; one job per core of mask 0x3; qd = queue depth)
00:09:40.894 Device      Core  LBA start/len    qd  : runtime(s)     IOPS  MiB/s  Fail/s  TO/s    Average        min        max
00:09:40.894 Malloc0     0x1   0x0/0x1000      128  :       5.07  1034.23   4.04    0.00  0.00  123495.53     548.51  282659.62
00:09:40.894 Malloc0     0x2   0x1000/0x1000   128  :       5.23  1003.61   3.92    0.00  0.00  127262.64     527.14  443137.34
00:09:40.894 Malloc1p0   0x1   0x0/0x800       128  :       5.29   532.76   2.08    0.00  0.00  238744.63    3405.02  266247.12
00:09:40.894 Malloc1p0   0x2   0x800/0x800     128  :       5.28   533.28   2.08    0.00  0.00  238565.77    3390.78  251658.24
00:09:40.894 Malloc1p1   0x1   0x0/0x800       128  :       5.29   532.20   2.08    0.00  0.00  238203.21    3490.50  260776.29
00:09:40.894 Malloc1p1   0x2   0x800/0x800     128  :       5.28   533.04   2.08    0.00  0.00  237842.55    3490.50  246187.41
00:09:40.894 Malloc2p0   0x1   0x0/0x200       128  :       5.30   531.68   2.08    0.00  0.00  237656.25    3462.01  251658.24
00:09:40.894 Malloc2p0   0x2   0x200/0x200     128  :       5.29   532.76   2.08    0.00  0.00  237162.13    3433.52  242540.19
00:09:40.894 Malloc2p1   0x1   0x0/0x200       128  :       5.30   531.18   2.07    0.00  0.00  237088.73    3604.48  246187.41
00:09:40.894 Malloc2p1   0x2   0x200/0x200     128  :       5.29   532.21   2.08    0.00  0.00  236623.40    3575.99  231598.53
00:09:40.894 Malloc2p2   0x1   0x0/0x200       128  :       5.31   530.70   2.07    0.00  0.00  236531.19    3575.99  244363.80
00:09:40.894 Malloc2p2   0x2   0x200/0x200     128  :       5.30   531.70   2.08    0.00  0.00  236075.98    3533.25  226127.69
00:09:40.894 Malloc2p3   0x1   0x0/0x200       128  :       5.31   530.23   2.07    0.00  0.00  235963.93    3419.27  238892.97
00:09:40.894 Malloc2p3   0x2   0x200/0x200     128  :       5.30   531.20   2.08    0.00  0.00  235502.41    3419.27  225215.89
00:09:40.894 Malloc2p4   0x1   0x0/0x200       128  :       5.32   529.78   2.07    0.00  0.00  235430.12    3547.49  235245.75
00:09:40.894 Malloc2p4   0x2   0x200/0x200     128  :       5.31   530.73   2.07    0.00  0.00  234989.65    3547.49  220656.86
00:09:40.894 Malloc2p5   0x1   0x0/0x200       128  :       5.32   529.35   2.07    0.00  0.00  234824.61    3519.00  230686.72
00:09:40.894 Malloc2p5   0x2   0x200/0x200     128  :       5.31   530.26   2.07    0.00  0.00  234408.39    3533.25  216097.84
00:09:40.894 Malloc2p6   0x1   0x0/0x200       128  :       5.32   528.92   2.07    0.00  0.00  234258.65    3504.75  226127.69
00:09:40.894 Malloc2p6   0x2   0x200/0x200     128  :       5.32   529.81   2.07    0.00  0.00  233836.22    3547.49  213362.42
00:09:40.894 Malloc2p7   0x1   0x0/0x200       128  :       5.33   528.64   2.06    0.00  0.00  233685.40    3333.79  219745.06
00:09:40.894 Malloc2p7   0x2   0x200/0x200     128  :       5.32   529.39   2.07    0.00  0.00  233301.50    3348.03  207891.59
00:09:40.894 TestPT      0x1   0x0/0x1000      128  :       5.33   507.44   1.98    0.00  0.00  240206.09   20857.54  218833.25
00:09:40.894 TestPT      0x2   0x1000/0x1000   128  :       5.34   505.46   1.97    0.00  0.00  242813.89   14132.98  291777.67
00:09:40.894 raid0       0x1   0x0/0x2000      128  :       5.33   528.05   2.06    0.00  0.00  232110.25    3348.03  195126.32
00:09:40.894 raid0       0x2   0x2000/0x2000   128  :       5.32   528.85   2.07    0.00  0.00  231818.59    3348.03  179625.63
00:09:40.894 concat0     0x1   0x0/0x2000      128  :       5.34   527.67   2.06    0.00  0.00  231526.22    3462.01  188743.68
00:09:40.894 concat0     0x2   0x2000/0x2000   128  :       5.33   528.57   2.06    0.00  0.00  231215.35    3490.50  176890.21
00:09:40.894 raid1       0x1   0x0/0x1000      128  :       5.34   527.25   2.06    0.00  0.00  230945.70    4131.62  187831.87
00:09:40.894 raid1       0x2   0x1000/0x1000   128  :       5.33   528.08   2.06    0.00  0.00  230613.01    4217.10  184184.65
00:09:40.894 AIO0        0x1   0x0/0x4e2       128  :       5.34   527.10   2.06    0.00  0.00  230245.96    1567.17  195126.32
00:09:40.894 AIO0        0x2   0x4e2/0x4e2     128  :       5.34   527.75   2.06    0.00  0.00  230022.48    1581.41  192390.90
00:09:40.894 ===================================================================================================================
00:09:40.894 Total : 17893.86 69.90 0.00 0.00 222906.02 527.14 443137.34
00:09:41.153 
00:09:41.153 real 0m6.613s
00:09:41.153 user 0m12.257s
00:09:41.153 sys 0m0.395s
00:09:41.153 09:13:49 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:41.153 09:13:49 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:09:41.153 ************************************
00:09:41.153 END TEST bdev_verify
00:09:41.153 ************************************
00:09:41.153 09:13:49 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:41.153 09:13:49 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:09:41.153 09:13:49 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:09:41.153 09:13:49 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:41.153 09:13:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:41.153 ************************************
00:09:41.153 START TEST bdev_verify_big_io
00:09:41.153 ************************************
00:09:41.153 09:13:49 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:09:41.153 [2024-07-15 09:13:50.060460] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization...
00:09:41.153 [2024-07-15 09:13:50.060525] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73835 ] 00:09:41.411 [2024-07-15 09:13:50.186252] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:41.411 [2024-07-15 09:13:50.287331] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:41.411 [2024-07-15 09:13:50.287337] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:41.669 [2024-07-15 09:13:50.447758] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:41.669 [2024-07-15 09:13:50.447821] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:41.669 [2024-07-15 09:13:50.447836] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:41.669 [2024-07-15 09:13:50.455768] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:41.669 [2024-07-15 09:13:50.455798] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:41.669 [2024-07-15 09:13:50.463780] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:41.669 [2024-07-15 09:13:50.463805] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:41.669 [2024-07-15 09:13:50.540970] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:41.669 [2024-07-15 09:13:50.541024] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:41.669 [2024-07-15 09:13:50.541044] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcde4d0 00:09:41.669 [2024-07-15 09:13:50.541057] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:41.669 [2024-07-15 09:13:50.542691] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:41.669 [2024-07-15 09:13:50.542722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:41.926 [2024-07-15 09:13:50.708299] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:41.926 [2024-07-15 09:13:50.709500] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:41.926 [2024-07-15 09:13:50.711267] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:09:41.926 [2024-07-15 09:13:50.712357] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:09:41.926 [2024-07-15 09:13:50.714004] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:41.926 [2024-07-15 09:13:50.715069] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:41.926 [2024-07-15 09:13:50.716688] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:41.926 [2024-07-15 09:13:50.718337] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:41.927 [2024-07-15 09:13:50.719392] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:41.927 [2024-07-15 09:13:50.721036] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:41.927 [2024-07-15 09:13:50.721988] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:41.927 [2024-07-15 09:13:50.723389] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:41.927 [2024-07-15 09:13:50.724272] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:41.927 [2024-07-15 09:13:50.725687] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:41.927 [2024-07-15 09:13:50.726573] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:09:41.927 [2024-07-15 09:13:50.727979] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32
00:09:41.927 [2024-07-15 09:13:50.751920] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:09:41.927 [2024-07-15 09:13:50.753920] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:09:41.927 Running I/O for 5 seconds...
00:09:50.020 
00:09:50.020 Latency(us)
00:09:50.020 (verify jobs, IO size 65536; one job per core of mask 0x3; qd = queue depth)
00:09:50.020 Device      Core  LBA start/len   qd  : runtime(s)    IOPS  MiB/s  Fail/s  TO/s     Average         min         max
00:09:50.020 Malloc0     0x1   0x0/0x100      128  :       5.98  171.15  10.70    0.00  0.00   732807.81      869.06  1969499.27
00:09:50.020 Malloc0     0x2   0x100/0x100    128  :       5.97  150.11   9.38    0.00  0.00   836737.07      897.56  2290454.71
00:09:50.020 Malloc1p0   0x1   0x0/0x80       128  :       6.89   34.82   2.18    0.00  0.00  3303311.29     1474.56  5572953.49
00:09:50.020 Malloc1p0   0x2   0x80/0x80      128  :       6.27   87.38   5.46    0.00  0.00  1351625.96     2507.46  2698943.44
00:09:50.020 Malloc1p1   0x1   0x0/0x80       128  :       6.89   34.81   2.18    0.00  0.00  3196589.03     1495.93  5368709.12
00:09:50.020 Malloc1p1   0x2   0x80/0x80      128  :       6.74   35.63   2.23    0.00  0.00  3141340.22     1538.67  5397886.89
00:09:50.020 Malloc2p0   0x1   0x0/0x20        32  :       6.21   23.18   1.45    0.00  0.00  1209499.84      619.74  2159154.75
00:09:50.020 Malloc2p0   0x2   0x20/0x20       32  :       6.19   23.25   1.45    0.00  0.00  1211695.60      626.87  1984088.15
00:09:50.020 Malloc2p1   0x1   0x0/0x20        32  :       6.21   23.18   1.45    0.00  0.00  1198146.35      641.11  2129976.99
00:09:50.020 Malloc2p1   0x2   0x20/0x20       32  :       6.20   23.24   1.45    0.00  0.00  1201664.36      641.11  1954910.39
00:09:50.020 Malloc2p2   0x1   0x0/0x20        32  :       6.21   23.17   1.45    0.00  0.00  1186844.65      641.11  2100799.22
00:09:50.020 Malloc2p2   0x2   0x20/0x20       32  :       6.20   23.24   1.45    0.00  0.00  1191162.60      658.92  1925732.62
00:09:50.020 Malloc2p3   0x1   0x0/0x20        32  :       6.34   25.25   1.58    0.00  0.00  1092786.90      633.99  2071621.45
00:09:50.020 Malloc2p3   0x2   0x20/0x20       32  :       6.27   25.50   1.59    0.00  0.00  1091889.25      648.24  1911143.74
00:09:50.020 Malloc2p4   0x1   0x0/0x20        32  :       6.34   25.25   1.58    0.00  0.00  1083103.58      644.67  2042443.69
00:09:50.020 Malloc2p4   0x2   0x20/0x20       32  :       6.27   25.50   1.59    0.00  0.00  1082506.94      651.80  1881965.97
00:09:50.020 Malloc2p5   0x1   0x0/0x20        32  :       6.34   25.24   1.58    0.00  0.00  1072796.17      644.67  2013265.92
00:09:50.020 Malloc2p5   0x2   0x20/0x20       32  :       6.28   25.49   1.59    0.00  0.00  1073122.94      655.36  1860082.64
00:09:50.020 Malloc2p6   0x1   0x0/0x20        32  :       6.34   25.24   1.58    0.00  0.00  1062532.77      648.24  1998677.04
00:09:50.020 Malloc2p6   0x2   0x20/0x20       32  :       6.28   25.49   1.59    0.00  0.00  1062984.20      644.67  1830904.88
00:09:50.020 Malloc2p7   0x1   0x0/0x20        32  :       6.34   25.23   1.58    0.00  0.00  1052224.57      648.24  1969499.27
00:09:50.020 Malloc2p7   0x2   0x20/0x20       32  :       6.28   25.48   1.59    0.00  0.00  1053462.85      655.36  1809021.55
00:09:50.020 TestPT      0x1   0x0/0x100      128  :       6.93   36.94   2.31    0.00  0.00  2748177.77     1488.81  4901864.85
00:09:50.020 TestPT      0x2   0x100/0x100    128  :       6.81   33.18   2.07    0.00  0.00  3063837.00   105313.50  3486743.15
00:09:50.020 raid0       0x1   0x0/0x200      128  :       6.91   41.70   2.61    0.00  0.00  2416627.47     1609.91  4726798.25
00:09:50.020 raid0       0x2   0x200/0x200    128  :       6.86   39.64   2.48    0.00  0.00  2495040.73     1595.66  4755976.01
00:09:50.020 concat0     0x1   0x0/0x200      128  :       6.92   43.93   2.75    0.00  0.00  2225080.07     1588.54  4551731.65
00:09:50.020 concat0     0x2   0x200/0x200    128  :       6.74   49.85   3.12    0.00  0.00  1967224.49     1609.91  4580909.41
00:09:50.020 raid1       0x1   0x0/0x100      128  :       6.90   69.43   4.34    0.00  0.00  1369784.42     1994.57  4347487.28
00:09:50.020 raid1       0x2   0x100/0x100    128  :       6.81   56.21   3.51    0.00  0.00  1705084.35     2037.31  4405842.81
00:09:50.020 AIO0        0x1   0x0/0x4e        78  :       6.90   59.68   3.73    0.00  0.00   941573.74      787.14  2874010.05
00:09:50.020 AIO0        0x2   0x4e/0x4e       78  :       6.86   64.53   4.03    0.00  0.00   885749.35      459.46  2859421.16
00:09:50.021 ===================================================================================================================
00:09:50.021 Total : 1401.92 87.62 0.00 0.00 1492127.45 459.46 5572953.49
00:09:50.021 
00:09:50.021 real 0m8.224s
00:09:50.021 user 0m15.479s
00:09:50.021 sys 0m0.423s
00:09:50.021 09:13:58 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:50.021 09:13:58 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:09:50.021 ************************************
00:09:50.021 END TEST bdev_verify_big_io
00:09:50.021 ************************************
00:09:50.021 09:13:58 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:50.021 09:13:58 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:50.021 09:13:58 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:50.021 09:13:58 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:50.021 09:13:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:50.021 ************************************
00:09:50.021 START TEST bdev_write_zeroes
00:09:50.021 ************************************
00:09:50.021 09:13:58 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:50.021 [2024-07-15 09:13:58.371447] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization...
00:09:50.021 [2024-07-15 09:13:58.371510] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid74904 ] 00:09:50.021 [2024-07-15 09:13:58.501205] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:50.021 [2024-07-15 09:13:58.603750] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:50.021 [2024-07-15 09:13:58.762355] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:50.021 [2024-07-15 09:13:58.762424] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:50.021 [2024-07-15 09:13:58.762440] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:50.021 [2024-07-15 09:13:58.770355] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:50.021 [2024-07-15 09:13:58.770384] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:50.021 [2024-07-15 09:13:58.778362] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:50.021 [2024-07-15 09:13:58.778388] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:50.021 [2024-07-15 09:13:58.855513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:50.021 [2024-07-15 09:13:58.855570] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:50.021 [2024-07-15 09:13:58.855589] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2065c10 00:09:50.021 [2024-07-15 09:13:58.855602] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:50.021 [2024-07-15 09:13:58.857119] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:50.021 [2024-07-15 09:13:58.857151] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:50.279 Running I/O for 1 seconds... 
00:09:51.212 
00:09:51.212 Latency(us)
00:09:51.212 (all jobs: Core Mask 0x1, workload write_zeroes, queue depth 128, IO size 4096)
00:09:51.212 Device      : runtime(s)     IOPS  MiB/s  Fail/s  TO/s   Average       min       max
00:09:51.212 Malloc0     :       1.03  4983.02  19.46    0.00  0.00  25675.48    662.48  42854.85
00:09:51.212 Malloc1p0   :       1.03  4975.49  19.44    0.00  0.00  25668.24    911.81  41943.04
00:09:51.212 Malloc1p1   :       1.05  4979.50  19.45    0.00  0.00  25591.49    904.68  41031.23
00:09:51.212 Malloc2p0   :       1.06  4972.39  19.42    0.00  0.00  25567.37    904.68  40119.43
00:09:51.212 Malloc2p1   :       1.06  4965.43  19.40    0.00  0.00  25543.14    904.68  39435.58
00:09:51.212 Malloc2p2   :       1.06  4958.47  19.37    0.00  0.00  25525.21    904.68  38523.77
00:09:51.212 Malloc2p3   :       1.06  4951.48  19.34    0.00  0.00  25507.07    901.12  37611.97
00:09:51.212 Malloc2p4   :       1.06  4944.58  19.31    0.00  0.00  25486.28    904.68  36700.16
00:09:51.212 Malloc2p5   :       1.06  4937.70  19.29    0.00  0.00  25464.34    901.12  35788.35
00:09:51.213 Malloc2p6   :       1.06  4930.77  19.26    0.00  0.00  25447.94    904.68  34876.55
00:09:51.213 Malloc2p7   :       1.07  4923.93  19.23    0.00  0.00  25425.29    901.12  33964.74
00:09:51.213 TestPT      :       1.07  4917.09  19.21    0.00  0.00  25407.96    940.30  33052.94
00:09:51.213 raid0       :       1.07  4909.19  19.18    0.00  0.00  25378.01   1609.91  31457.28
00:09:51.213 concat0     :       1.07  4901.45  19.15    0.00  0.00  25321.97   1588.54  29861.62
00:09:51.213 raid1       :       1.07  4891.79  19.11    0.00  0.00  25259.37   2564.45  27240.18
00:09:51.213 AIO0        :       1.07  4885.89  19.09    0.00  0.00  25171.35   1040.03  26898.25
00:09:51.213 ===================================================================================================================
00:09:51.213 Total : 79028.18 308.70 0.00 0.00 25464.40 662.48 42854.85
00:09:51.779 
00:09:51.779 real 0m2.249s
00:09:51.779 user 0m1.835s
00:09:51.779 sys 0m0.364s
00:09:51.779 09:14:00 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:51.779 09:14:00 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:09:51.779 ************************************
00:09:51.779 END TEST bdev_write_zeroes
00:09:51.779 ************************************
00:09:51.779 09:14:00 blockdev_general
-- common/autotest_common.sh@1142 -- # return 0 00:09:51.779 09:14:00 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:51.779 09:14:00 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:51.779 09:14:00 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:51.779 09:14:00 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:51.779 ************************************ 00:09:51.779 START TEST bdev_json_nonenclosed 00:09:51.779 ************************************ 00:09:51.779 09:14:00 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:51.779 [2024-07-15 09:14:00.702417] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:09:51.779 [2024-07-15 09:14:00.702479] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75187 ] 00:09:52.037 [2024-07-15 09:14:00.833658] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:52.037 [2024-07-15 09:14:00.933237] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:52.037 [2024-07-15 09:14:00.933312] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:09:52.037 [2024-07-15 09:14:00.933333] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:52.037 [2024-07-15 09:14:00.933346] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:52.295 00:09:52.295 real 0m0.397s 00:09:52.295 user 0m0.239s 00:09:52.295 sys 0m0.155s 00:09:52.295 09:14:01 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:09:52.295 09:14:01 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:52.295 09:14:01 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:09:52.295 ************************************ 00:09:52.295 END TEST bdev_json_nonenclosed 00:09:52.295 ************************************ 00:09:52.295 09:14:01 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:52.295 09:14:01 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:09:52.295 09:14:01 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:52.295 09:14:01 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:52.295 09:14:01 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.295 09:14:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:52.295 ************************************ 00:09:52.295 START TEST bdev_json_nonarray 00:09:52.295 ************************************ 00:09:52.295 09:14:01 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:52.295 [2024-07-15 09:14:01.194917] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:09:52.295 [2024-07-15 09:14:01.194988] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75299 ] 00:09:52.553 [2024-07-15 09:14:01.323908] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:52.553 [2024-07-15 09:14:01.420729] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:52.553 [2024-07-15 09:14:01.420807] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
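The two JSON negative tests here (bdev_json_nonenclosed and bdev_json_nonarray) deliberately feed bdevperf a malformed --json config: nonenclosed.json trips "not enclosed in {}" and nonarray.json trips "'subsystems' should be an array", so json_config_prepare_ctx aborts the app and run_test records the non-zero exit (es=234 in the trace). The contents of those two files are not echoed in this log; for reference, a minimal well-formed config of the shape json_config_prepare_ctx expects looks roughly like the sketch below (the bdev_malloc_create entry is illustrative, mirroring the 262144 x 512-byte malloc bdevs created elsewhere in this run):

  {
    "subsystems": [
      {
        "subsystem": "bdev",
        "config": [
          {
            "method": "bdev_malloc_create",
            "params": { "name": "Malloc0", "num_blocks": 262144, "block_size": 512 }
          }
        ]
      }
    ]
  }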
00:09:52.553 [2024-07-15 09:14:01.420827] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:52.553 [2024-07-15 09:14:01.420841] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:52.811 00:09:52.811 real 0m0.389s 00:09:52.811 user 0m0.233s 00:09:52.811 sys 0m0.153s 00:09:52.811 09:14:01 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:09:52.811 09:14:01 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:52.811 09:14:01 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:09:52.811 ************************************ 00:09:52.811 END TEST bdev_json_nonarray 00:09:52.811 ************************************ 00:09:52.811 09:14:01 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:52.811 09:14:01 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:09:52.811 09:14:01 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:09:52.811 09:14:01 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:09:52.811 09:14:01 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:52.811 09:14:01 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.811 09:14:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:52.811 ************************************ 00:09:52.811 START TEST bdev_qos 00:09:52.811 ************************************ 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=75321 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 75321' 00:09:52.811 Process qos testing pid: 75321 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 75321 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 75321 ']' 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:52.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:52.811 09:14:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:52.811 [2024-07-15 09:14:01.670816] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:09:52.811 [2024-07-15 09:14:01.670886] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75321 ] 00:09:53.069 [2024-07-15 09:14:01.790204] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:53.069 [2024-07-15 09:14:01.896533] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:54.002 Malloc_0 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.002 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:54.002 [ 00:09:54.002 { 00:09:54.002 "name": "Malloc_0", 00:09:54.002 "aliases": [ 00:09:54.002 "59f6dba4-8550-4933-a450-9f7a096ad4b0" 00:09:54.002 ], 00:09:54.002 "product_name": "Malloc disk", 00:09:54.002 "block_size": 512, 00:09:54.002 "num_blocks": 262144, 00:09:54.002 "uuid": "59f6dba4-8550-4933-a450-9f7a096ad4b0", 00:09:54.002 "assigned_rate_limits": { 00:09:54.002 "rw_ios_per_sec": 0, 00:09:54.002 "rw_mbytes_per_sec": 0, 00:09:54.003 "r_mbytes_per_sec": 0, 00:09:54.003 "w_mbytes_per_sec": 0 00:09:54.003 }, 00:09:54.003 "claimed": false, 00:09:54.003 "zoned": false, 00:09:54.003 "supported_io_types": { 00:09:54.003 "read": true, 00:09:54.003 "write": true, 00:09:54.003 "unmap": true, 00:09:54.003 "flush": true, 00:09:54.003 "reset": true, 00:09:54.003 "nvme_admin": false, 00:09:54.003 "nvme_io": false, 00:09:54.003 "nvme_io_md": false, 00:09:54.003 "write_zeroes": true, 00:09:54.003 "zcopy": true, 00:09:54.003 "get_zone_info": false, 00:09:54.003 "zone_management": false, 00:09:54.003 "zone_append": false, 00:09:54.003 "compare": false, 
00:09:54.003 "compare_and_write": false, 00:09:54.003 "abort": true, 00:09:54.003 "seek_hole": false, 00:09:54.003 "seek_data": false, 00:09:54.003 "copy": true, 00:09:54.003 "nvme_iov_md": false 00:09:54.003 }, 00:09:54.003 "memory_domains": [ 00:09:54.003 { 00:09:54.003 "dma_device_id": "system", 00:09:54.003 "dma_device_type": 1 00:09:54.003 }, 00:09:54.003 { 00:09:54.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:54.003 "dma_device_type": 2 00:09:54.003 } 00:09:54.003 ], 00:09:54.003 "driver_specific": {} 00:09:54.003 } 00:09:54.003 ] 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:54.003 Null_1 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:54.003 [ 00:09:54.003 { 00:09:54.003 "name": "Null_1", 00:09:54.003 "aliases": [ 00:09:54.003 "7732f01c-1962-4103-beee-278d2a4aba0c" 00:09:54.003 ], 00:09:54.003 "product_name": "Null disk", 00:09:54.003 "block_size": 512, 00:09:54.003 "num_blocks": 262144, 00:09:54.003 "uuid": "7732f01c-1962-4103-beee-278d2a4aba0c", 00:09:54.003 "assigned_rate_limits": { 00:09:54.003 "rw_ios_per_sec": 0, 00:09:54.003 "rw_mbytes_per_sec": 0, 00:09:54.003 "r_mbytes_per_sec": 0, 00:09:54.003 "w_mbytes_per_sec": 0 00:09:54.003 }, 00:09:54.003 "claimed": false, 00:09:54.003 "zoned": false, 00:09:54.003 "supported_io_types": { 00:09:54.003 "read": true, 00:09:54.003 "write": true, 00:09:54.003 "unmap": false, 00:09:54.003 "flush": false, 00:09:54.003 "reset": true, 00:09:54.003 "nvme_admin": false, 00:09:54.003 "nvme_io": false, 00:09:54.003 "nvme_io_md": false, 00:09:54.003 "write_zeroes": true, 00:09:54.003 "zcopy": false, 00:09:54.003 "get_zone_info": false, 00:09:54.003 "zone_management": false, 00:09:54.003 "zone_append": false, 00:09:54.003 
"compare": false, 00:09:54.003 "compare_and_write": false, 00:09:54.003 "abort": true, 00:09:54.003 "seek_hole": false, 00:09:54.003 "seek_data": false, 00:09:54.003 "copy": false, 00:09:54.003 "nvme_iov_md": false 00:09:54.003 }, 00:09:54.003 "driver_specific": {} 00:09:54.003 } 00:09:54.003 ] 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:54.003 09:14:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:54.003 Running I/O for 60 seconds... 
00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 63035.02 252140.09 0.00 0.00 252928.00 0.00 0.00 ' 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=63035.02 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 63035 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=63035 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=15000 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 15000 -gt 1000 ']' 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 15000 Malloc_0 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 15000 IOPS Malloc_0 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:59.260 09:14:07 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:59.260 ************************************ 00:09:59.260 START TEST bdev_qos_iops 00:09:59.260 ************************************ 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 15000 IOPS Malloc_0 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=15000 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:59.260 09:14:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 14996.25 59984.98 0.00 0.00 60900.00 0.00 0.00 ' 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=14996.25 00:10:04.612 09:14:13 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 14996 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=14996 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=13500 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=16500 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14996 -lt 13500 ']' 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14996 -gt 16500 ']' 00:10:04.612 00:10:04.612 real 0m5.247s 00:10:04.612 user 0m0.112s 00:10:04.612 sys 0m0.050s 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:04.612 09:14:13 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:04.612 ************************************ 00:10:04.612 END TEST bdev_qos_iops 00:10:04.612 ************************************ 00:10:04.612 09:14:13 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:04.612 09:14:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:10:04.612 09:14:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:04.612 09:14:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:04.612 09:14:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:04.612 09:14:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:04.612 09:14:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:04.612 09:14:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 20183.65 80734.60 0.00 0.00 81920.00 0.00 0.00 ' 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=81920.00 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 81920 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=81920 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:09.876 09:14:18 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:09.876 ************************************ 00:10:09.876 START TEST bdev_qos_bw 00:10:09.876 ************************************ 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:09.876 09:14:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2046.73 8186.91 0.00 0.00 8412.00 0.00 0.00 ' 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8412.00 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8412 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8412 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8412 -lt 7372 ']' 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8412 -gt 9011 ']' 00:10:15.144 00:10:15.144 real 0m5.262s 00:10:15.144 user 0m0.079s 00:10:15.144 sys 0m0.049s 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:10:15.144 ************************************ 00:10:15.144 END TEST bdev_qos_bw 00:10:15.144 ************************************ 00:10:15.144 09:14:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # 
return 0 00:10:15.144 09:14:23 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:10:15.144 09:14:23 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.144 09:14:23 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:15.144 09:14:23 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.144 09:14:23 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:10:15.144 09:14:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:15.144 09:14:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:15.144 09:14:23 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:15.144 ************************************ 00:10:15.144 START TEST bdev_qos_ro_bw 00:10:15.144 ************************************ 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:15.144 09:14:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.78 2047.12 0.00 0.00 2060.00 0.00 0.00 ' 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw 
-- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:10:20.411 00:10:20.411 real 0m5.184s 00:10:20.411 user 0m0.107s 00:10:20.411 sys 0m0.053s 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:20.411 09:14:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:20.411 ************************************ 00:10:20.411 END TEST bdev_qos_ro_bw 00:10:20.411 ************************************ 00:10:20.411 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:20.411 09:14:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:20.411 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:20.411 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:20.977 00:10:20.977 Latency(us) 00:10:20.977 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:20.977 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:20.977 Malloc_0 : 26.79 20953.24 81.85 0.00 0.00 12105.20 1994.57 503316.48 00:10:20.977 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:20.977 Null_1 : 26.94 20635.25 80.61 0.00 0.00 12374.52 804.95 151359.67 00:10:20.977 =================================================================================================================== 00:10:20.977 Total : 41588.49 162.46 0.00 0.00 12239.21 804.95 503316.48 00:10:20.977 0 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 75321 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 75321 ']' 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 75321 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 75321 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 75321' 00:10:20.977 killing process with pid 75321 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 75321 00:10:20.977 Received shutdown signal, test time was about 27.001094 seconds 00:10:20.977 00:10:20.977 Latency(us) 00:10:20.977 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:20.977 
=================================================================================================================== 00:10:20.977 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:20.977 09:14:29 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 75321 00:10:21.235 09:14:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:10:21.235 00:10:21.235 real 0m28.489s 00:10:21.235 user 0m29.197s 00:10:21.235 sys 0m0.882s 00:10:21.235 09:14:30 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:21.235 09:14:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:21.235 ************************************ 00:10:21.235 END TEST bdev_qos 00:10:21.235 ************************************ 00:10:21.235 09:14:30 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:21.235 09:14:30 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:21.235 09:14:30 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:21.235 09:14:30 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:21.235 09:14:30 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:21.235 ************************************ 00:10:21.235 START TEST bdev_qd_sampling 00:10:21.235 ************************************ 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=79107 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 79107' 00:10:21.235 Process bdev QD sampling period testing pid: 79107 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 79107 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 79107 ']' 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:21.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:21.235 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:21.492 [2024-07-15 09:14:30.235741] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
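The bdev_qos_ro_bw check that closes the QoS suite above passes because the measured read bandwidth stays inside the tolerance window the test derives from the configured limit: the 2 MB/s read-only cap is treated as 2048 KiB/s, the accepted range is 90-110% of that (1843-2252 KiB/s), and the iostat.py sample reports 2060 KiB/s for Malloc_0. A minimal standalone sketch of the same check, assuming a running SPDK target on the default /var/tmp/spdk.sock with an existing Malloc_0 bdev (everything below mirrors commands visible in the trace, but it is an illustration rather than the harness itself):

    # Hypothetical reproduction of the read-only bandwidth QoS check
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    $SPDK/scripts/rpc.py bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0
    qos_limit_kb=2048                      # 2 MB/s expressed in KiB/s
    lower=$((qos_limit_kb * 9 / 10))       # 1843
    upper=$((qos_limit_kb * 11 / 10))      # 2252
    # field 6 of the last Malloc_0 sample is the field the test script reads
    result=$($SPDK/scripts/iostat.py -d -i 1 -t 5 | grep Malloc_0 | tail -1 | awk '{print $6}')
    result=${result%.*}                    # 2060.00 -> 2060
    [ "$result" -ge "$lower" ] && [ "$result" -le "$upper" ] && echo PASS || echo FAIL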
00:10:21.492 [2024-07-15 09:14:30.235805] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid79107 ] 00:10:21.492 [2024-07-15 09:14:30.366110] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:21.749 [2024-07-15 09:14:30.476483] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:21.749 [2024-07-15 09:14:30.476490] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:21.749 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:21.749 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:10:21.749 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:21.749 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.749 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:22.006 Malloc_QD 00:10:22.006 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:22.007 [ 00:10:22.007 { 00:10:22.007 "name": "Malloc_QD", 00:10:22.007 "aliases": [ 00:10:22.007 "4bff5b23-e014-47e3-8606-f723c774c4af" 00:10:22.007 ], 00:10:22.007 "product_name": "Malloc disk", 00:10:22.007 "block_size": 512, 00:10:22.007 "num_blocks": 262144, 00:10:22.007 "uuid": "4bff5b23-e014-47e3-8606-f723c774c4af", 00:10:22.007 "assigned_rate_limits": { 00:10:22.007 "rw_ios_per_sec": 0, 00:10:22.007 "rw_mbytes_per_sec": 0, 00:10:22.007 "r_mbytes_per_sec": 0, 00:10:22.007 "w_mbytes_per_sec": 0 00:10:22.007 }, 00:10:22.007 "claimed": false, 00:10:22.007 "zoned": false, 00:10:22.007 "supported_io_types": { 00:10:22.007 "read": true, 00:10:22.007 "write": true, 00:10:22.007 "unmap": true, 00:10:22.007 "flush": true, 00:10:22.007 "reset": true, 00:10:22.007 "nvme_admin": false, 00:10:22.007 
"nvme_io": false, 00:10:22.007 "nvme_io_md": false, 00:10:22.007 "write_zeroes": true, 00:10:22.007 "zcopy": true, 00:10:22.007 "get_zone_info": false, 00:10:22.007 "zone_management": false, 00:10:22.007 "zone_append": false, 00:10:22.007 "compare": false, 00:10:22.007 "compare_and_write": false, 00:10:22.007 "abort": true, 00:10:22.007 "seek_hole": false, 00:10:22.007 "seek_data": false, 00:10:22.007 "copy": true, 00:10:22.007 "nvme_iov_md": false 00:10:22.007 }, 00:10:22.007 "memory_domains": [ 00:10:22.007 { 00:10:22.007 "dma_device_id": "system", 00:10:22.007 "dma_device_type": 1 00:10:22.007 }, 00:10:22.007 { 00:10:22.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.007 "dma_device_type": 2 00:10:22.007 } 00:10:22.007 ], 00:10:22.007 "driver_specific": {} 00:10:22.007 } 00:10:22.007 ] 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:10:22.007 09:14:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:22.007 Running I/O for 5 seconds... 00:10:23.945 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:10:23.945 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:10:23.945 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:10:23.945 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:10:23.945 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:23.945 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:23.945 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:23.945 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:10:23.946 "tick_rate": 2300000000, 00:10:23.946 "ticks": 5324619506810878, 00:10:23.946 "bdevs": [ 00:10:23.946 { 00:10:23.946 "name": "Malloc_QD", 00:10:23.946 "bytes_read": 771797504, 00:10:23.946 "num_read_ops": 188420, 00:10:23.946 "bytes_written": 0, 00:10:23.946 "num_write_ops": 0, 00:10:23.946 "bytes_unmapped": 0, 00:10:23.946 "num_unmap_ops": 0, 00:10:23.946 "bytes_copied": 0, 00:10:23.946 "num_copy_ops": 0, 00:10:23.946 "read_latency_ticks": 2240473577116, 00:10:23.946 "max_read_latency_ticks": 14389316, 00:10:23.946 "min_read_latency_ticks": 267846, 00:10:23.946 "write_latency_ticks": 0, 00:10:23.946 "max_write_latency_ticks": 0, 00:10:23.946 "min_write_latency_ticks": 0, 00:10:23.946 "unmap_latency_ticks": 0, 00:10:23.946 "max_unmap_latency_ticks": 0, 00:10:23.946 
"min_unmap_latency_ticks": 0, 00:10:23.946 "copy_latency_ticks": 0, 00:10:23.946 "max_copy_latency_ticks": 0, 00:10:23.946 "min_copy_latency_ticks": 0, 00:10:23.946 "io_error": {}, 00:10:23.946 "queue_depth_polling_period": 10, 00:10:23.946 "queue_depth": 512, 00:10:23.946 "io_time": 30, 00:10:23.946 "weighted_io_time": 15360 00:10:23.946 } 00:10:23.946 ] 00:10:23.946 }' 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:23.946 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:23.946 00:10:23.946 Latency(us) 00:10:23.946 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:23.946 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:23.946 Malloc_QD : 1.99 48908.53 191.05 0.00 0.00 5221.33 1417.57 5527.82 00:10:23.946 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:23.946 Malloc_QD : 1.99 50019.94 195.39 0.00 0.00 5105.91 954.55 6268.66 00:10:23.946 =================================================================================================================== 00:10:23.946 Total : 98928.47 386.44 0.00 0.00 5162.94 954.55 6268.66 00:10:24.203 0 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 79107 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 79107 ']' 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 79107 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 79107 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 79107' 00:10:24.203 killing process with pid 79107 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 79107 00:10:24.203 Received shutdown signal, test time was about 2.068596 seconds 00:10:24.203 00:10:24.203 Latency(us) 00:10:24.203 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:24.203 =================================================================================================================== 00:10:24.203 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:24.203 09:14:32 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@972 -- # wait 79107 00:10:24.461 09:14:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:10:24.461 00:10:24.461 real 0m2.994s 00:10:24.461 user 0m5.885s 00:10:24.461 sys 0m0.433s 00:10:24.461 09:14:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:24.461 09:14:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:24.461 ************************************ 00:10:24.461 END TEST bdev_qd_sampling 00:10:24.461 ************************************ 00:10:24.461 09:14:33 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:24.461 09:14:33 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:10:24.461 09:14:33 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:24.461 09:14:33 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:24.461 09:14:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:24.461 ************************************ 00:10:24.461 START TEST bdev_error 00:10:24.461 ************************************ 00:10:24.461 09:14:33 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite '' 00:10:24.461 09:14:33 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:10:24.461 09:14:33 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:10:24.461 09:14:33 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:10:24.461 09:14:33 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=79510 00:10:24.461 09:14:33 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 79510' 00:10:24.461 Process error testing pid: 79510 00:10:24.462 09:14:33 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 79510 00:10:24.462 09:14:33 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:24.462 09:14:33 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 79510 ']' 00:10:24.462 09:14:33 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:24.462 09:14:33 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:24.462 09:14:33 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:24.462 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:24.462 09:14:33 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:24.462 09:14:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:24.462 [2024-07-15 09:14:33.320753] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
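While the error-injection bdevperf instance above is still initializing, it is worth summarizing what the qd_sampling suite it follows actually verified: the script created a 128 MiB Malloc bdev (262144 blocks of 512 bytes), enabled queue-depth sampling with a 10 ms period, and confirmed via bdev_get_iostat that queue_depth_polling_period is reported back as 10 while bdevperf issues random reads. A rough hand-driven equivalent, assuming the default RPC socket and jq on the path, might look like:

    # Hedged sketch of the queue-depth sampling round trip from the trace above
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    $SPDK/scripts/rpc.py bdev_malloc_create -b Malloc_QD 128 512     # 128 MiB bdev, 512 B blocks
    $SPDK/scripts/rpc.py bdev_set_qd_sampling_period Malloc_QD 10    # sample queue depth every 10 ms
    # ...generate load against Malloc_QD, e.g. with bdevperf.py perform_tests...
    period=$($SPDK/scripts/rpc.py bdev_get_iostat -b Malloc_QD | jq -r '.bdevs[0].queue_depth_polling_period')
    [ "$period" -eq 10 ] && echo "sampling period applied" || echo "unexpected period: $period"
    $SPDK/scripts/rpc.py bdev_malloc_delete Malloc_QD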
00:10:24.462 [2024-07-15 09:14:33.320820] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid79510 ] 00:10:24.720 [2024-07-15 09:14:33.440508] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:24.720 [2024-07-15 09:14:33.546314] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:25.655 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:25.656 09:14:34 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:25.656 Dev_1 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:25.656 [ 00:10:25.656 { 00:10:25.656 "name": "Dev_1", 00:10:25.656 "aliases": [ 00:10:25.656 "13975e6c-4e6f-46e5-ac8f-ea696ac4fcba" 00:10:25.656 ], 00:10:25.656 "product_name": "Malloc disk", 00:10:25.656 "block_size": 512, 00:10:25.656 "num_blocks": 262144, 00:10:25.656 "uuid": "13975e6c-4e6f-46e5-ac8f-ea696ac4fcba", 00:10:25.656 "assigned_rate_limits": { 00:10:25.656 "rw_ios_per_sec": 0, 00:10:25.656 "rw_mbytes_per_sec": 0, 00:10:25.656 "r_mbytes_per_sec": 0, 00:10:25.656 "w_mbytes_per_sec": 0 00:10:25.656 }, 00:10:25.656 "claimed": false, 00:10:25.656 "zoned": false, 00:10:25.656 "supported_io_types": { 00:10:25.656 "read": true, 00:10:25.656 "write": true, 00:10:25.656 "unmap": true, 00:10:25.656 "flush": true, 00:10:25.656 "reset": true, 00:10:25.656 "nvme_admin": false, 00:10:25.656 "nvme_io": false, 00:10:25.656 "nvme_io_md": false, 00:10:25.656 "write_zeroes": true, 00:10:25.656 "zcopy": true, 00:10:25.656 "get_zone_info": false, 00:10:25.656 "zone_management": false, 00:10:25.656 "zone_append": false, 00:10:25.656 
"compare": false, 00:10:25.656 "compare_and_write": false, 00:10:25.656 "abort": true, 00:10:25.656 "seek_hole": false, 00:10:25.656 "seek_data": false, 00:10:25.656 "copy": true, 00:10:25.656 "nvme_iov_md": false 00:10:25.656 }, 00:10:25.656 "memory_domains": [ 00:10:25.656 { 00:10:25.656 "dma_device_id": "system", 00:10:25.656 "dma_device_type": 1 00:10:25.656 }, 00:10:25.656 { 00:10:25.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:25.656 "dma_device_type": 2 00:10:25.656 } 00:10:25.656 ], 00:10:25.656 "driver_specific": {} 00:10:25.656 } 00:10:25.656 ] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:25.656 09:14:34 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:25.656 true 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:25.656 Dev_2 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:25.656 [ 00:10:25.656 { 00:10:25.656 "name": "Dev_2", 00:10:25.656 "aliases": [ 00:10:25.656 "c1a86246-5ea6-4240-91f7-3e324a44cac0" 00:10:25.656 ], 00:10:25.656 "product_name": "Malloc disk", 00:10:25.656 "block_size": 512, 00:10:25.656 "num_blocks": 262144, 00:10:25.656 "uuid": "c1a86246-5ea6-4240-91f7-3e324a44cac0", 00:10:25.656 "assigned_rate_limits": { 00:10:25.656 "rw_ios_per_sec": 0, 00:10:25.656 "rw_mbytes_per_sec": 0, 00:10:25.656 "r_mbytes_per_sec": 0, 00:10:25.656 "w_mbytes_per_sec": 0 00:10:25.656 }, 00:10:25.656 "claimed": false, 
00:10:25.656 "zoned": false, 00:10:25.656 "supported_io_types": { 00:10:25.656 "read": true, 00:10:25.656 "write": true, 00:10:25.656 "unmap": true, 00:10:25.656 "flush": true, 00:10:25.656 "reset": true, 00:10:25.656 "nvme_admin": false, 00:10:25.656 "nvme_io": false, 00:10:25.656 "nvme_io_md": false, 00:10:25.656 "write_zeroes": true, 00:10:25.656 "zcopy": true, 00:10:25.656 "get_zone_info": false, 00:10:25.656 "zone_management": false, 00:10:25.656 "zone_append": false, 00:10:25.656 "compare": false, 00:10:25.656 "compare_and_write": false, 00:10:25.656 "abort": true, 00:10:25.656 "seek_hole": false, 00:10:25.656 "seek_data": false, 00:10:25.656 "copy": true, 00:10:25.656 "nvme_iov_md": false 00:10:25.656 }, 00:10:25.656 "memory_domains": [ 00:10:25.656 { 00:10:25.656 "dma_device_id": "system", 00:10:25.656 "dma_device_type": 1 00:10:25.656 }, 00:10:25.656 { 00:10:25.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:25.656 "dma_device_type": 2 00:10:25.656 } 00:10:25.656 ], 00:10:25.656 "driver_specific": {} 00:10:25.656 } 00:10:25.656 ] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:25.656 09:14:34 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:25.656 09:14:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:25.656 09:14:34 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:10:25.656 09:14:34 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:25.656 Running I/O for 5 seconds... 00:10:26.591 09:14:35 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 79510 00:10:26.591 09:14:35 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 79510' 00:10:26.591 Process is existed as continue on error is set. 
Pid: 79510 00:10:26.591 09:14:35 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:26.591 09:14:35 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:26.591 09:14:35 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:26.591 09:14:35 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:26.591 09:14:35 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:26.591 09:14:35 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:26.591 09:14:35 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:26.591 09:14:35 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:26.591 09:14:35 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:10:26.591 Timeout while waiting for response: 00:10:26.591 00:10:26.591 00:10:30.846 00:10:30.846 Latency(us) 00:10:30.846 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:30.846 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:30.846 EE_Dev_1 : 0.89 37482.01 146.41 5.61 0.00 423.32 131.78 683.85 00:10:30.846 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:30.846 Dev_2 : 5.00 81193.91 317.16 0.00 0.00 193.55 75.69 22567.18 00:10:30.846 =================================================================================================================== 00:10:30.846 Total : 118675.92 463.58 5.61 0.00 211.04 75.69 22567.18 00:10:31.781 09:14:40 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 79510 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 79510 ']' 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 79510 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 79510 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 79510' 00:10:31.781 killing process with pid 79510 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 79510 00:10:31.781 Received shutdown signal, test time was about 5.000000 seconds 00:10:31.781 00:10:31.781 Latency(us) 00:10:31.781 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:31.781 =================================================================================================================== 00:10:31.781 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:31.781 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 79510 00:10:32.039 09:14:40 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=80549 00:10:32.039 09:14:40 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 80549' 00:10:32.039 Process error testing pid: 80549 00:10:32.039 09:14:40 blockdev_general.bdev_error -- 
bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:32.039 09:14:40 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 80549 00:10:32.039 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 80549 ']' 00:10:32.039 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:32.039 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:32.039 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:32.039 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:32.039 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:32.039 09:14:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:32.039 [2024-07-15 09:14:40.845922] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:10:32.039 [2024-07-15 09:14:40.846010] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid80549 ] 00:10:32.039 [2024-07-15 09:14:40.964525] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:32.297 [2024-07-15 09:14:41.072221] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:32.864 09:14:41 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:32.864 Dev_1 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:32.864 09:14:41 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:32.864 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:33.124 09:14:41 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:33.124 [ 00:10:33.124 { 00:10:33.124 "name": "Dev_1", 00:10:33.124 "aliases": [ 00:10:33.124 "0807b004-dd58-4386-9748-f7b4976bc9ae" 00:10:33.124 ], 00:10:33.124 "product_name": "Malloc disk", 00:10:33.124 "block_size": 512, 00:10:33.124 "num_blocks": 262144, 00:10:33.124 "uuid": "0807b004-dd58-4386-9748-f7b4976bc9ae", 00:10:33.124 "assigned_rate_limits": { 00:10:33.124 "rw_ios_per_sec": 0, 00:10:33.124 "rw_mbytes_per_sec": 0, 00:10:33.124 "r_mbytes_per_sec": 0, 00:10:33.124 "w_mbytes_per_sec": 0 00:10:33.124 }, 00:10:33.124 "claimed": false, 00:10:33.124 "zoned": false, 00:10:33.124 "supported_io_types": { 00:10:33.124 "read": true, 00:10:33.124 "write": true, 00:10:33.124 "unmap": true, 00:10:33.124 "flush": true, 00:10:33.124 "reset": true, 00:10:33.124 "nvme_admin": false, 00:10:33.124 "nvme_io": false, 00:10:33.124 "nvme_io_md": false, 00:10:33.124 "write_zeroes": true, 00:10:33.124 "zcopy": true, 00:10:33.124 "get_zone_info": false, 00:10:33.124 "zone_management": false, 00:10:33.124 "zone_append": false, 00:10:33.124 "compare": false, 00:10:33.124 "compare_and_write": false, 00:10:33.124 "abort": true, 00:10:33.124 "seek_hole": false, 00:10:33.124 "seek_data": false, 00:10:33.124 "copy": true, 00:10:33.124 "nvme_iov_md": false 00:10:33.124 }, 00:10:33.124 "memory_domains": [ 00:10:33.124 { 00:10:33.124 "dma_device_id": "system", 00:10:33.124 "dma_device_type": 1 00:10:33.124 }, 00:10:33.124 { 00:10:33.124 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:33.124 "dma_device_type": 2 00:10:33.124 } 00:10:33.124 ], 00:10:33.124 "driver_specific": {} 00:10:33.124 } 00:10:33.124 ] 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:33.124 09:14:41 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:33.124 true 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.124 09:14:41 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:33.124 Dev_2 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.124 09:14:41 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd 
bdev_wait_for_examine 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.124 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:33.124 [ 00:10:33.124 { 00:10:33.124 "name": "Dev_2", 00:10:33.124 "aliases": [ 00:10:33.124 "92779982-1384-49cf-b102-134e6d4bb762" 00:10:33.124 ], 00:10:33.124 "product_name": "Malloc disk", 00:10:33.124 "block_size": 512, 00:10:33.125 "num_blocks": 262144, 00:10:33.125 "uuid": "92779982-1384-49cf-b102-134e6d4bb762", 00:10:33.125 "assigned_rate_limits": { 00:10:33.125 "rw_ios_per_sec": 0, 00:10:33.125 "rw_mbytes_per_sec": 0, 00:10:33.125 "r_mbytes_per_sec": 0, 00:10:33.125 "w_mbytes_per_sec": 0 00:10:33.125 }, 00:10:33.125 "claimed": false, 00:10:33.125 "zoned": false, 00:10:33.125 "supported_io_types": { 00:10:33.125 "read": true, 00:10:33.125 "write": true, 00:10:33.125 "unmap": true, 00:10:33.125 "flush": true, 00:10:33.125 "reset": true, 00:10:33.125 "nvme_admin": false, 00:10:33.125 "nvme_io": false, 00:10:33.125 "nvme_io_md": false, 00:10:33.125 "write_zeroes": true, 00:10:33.125 "zcopy": true, 00:10:33.125 "get_zone_info": false, 00:10:33.125 "zone_management": false, 00:10:33.125 "zone_append": false, 00:10:33.125 "compare": false, 00:10:33.125 "compare_and_write": false, 00:10:33.125 "abort": true, 00:10:33.125 "seek_hole": false, 00:10:33.125 "seek_data": false, 00:10:33.125 "copy": true, 00:10:33.125 "nvme_iov_md": false 00:10:33.125 }, 00:10:33.125 "memory_domains": [ 00:10:33.125 { 00:10:33.125 "dma_device_id": "system", 00:10:33.125 "dma_device_type": 1 00:10:33.125 }, 00:10:33.125 { 00:10:33.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:33.125 "dma_device_type": 2 00:10:33.125 } 00:10:33.125 ], 00:10:33.125 "driver_specific": {} 00:10:33.125 } 00:10:33.125 ] 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:33.125 09:14:41 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.125 09:14:41 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 80549 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:33.125 09:14:41 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 80549 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:33.125 09:14:41 blockdev_general.bdev_error -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:33.125 09:14:41 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 80549 00:10:33.125 Running I/O for 5 seconds... 00:10:33.125 task offset: 222128 on job bdev=EE_Dev_1 fails 00:10:33.125 00:10:33.125 Latency(us) 00:10:33.125 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:33.125 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:33.125 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:33.125 EE_Dev_1 : 0.00 29972.75 117.08 6811.99 0.00 361.90 129.11 644.67 00:10:33.125 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:33.125 Dev_2 : 0.00 18348.62 71.67 0.00 0.00 652.52 124.66 1218.11 00:10:33.125 =================================================================================================================== 00:10:33.125 Total : 48321.38 188.76 6811.99 0.00 519.52 124.66 1218.11 00:10:33.125 [2024-07-15 09:14:42.045093] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:33.125 request: 00:10:33.125 { 00:10:33.125 "method": "perform_tests", 00:10:33.125 "req_id": 1 00:10:33.125 } 00:10:33.125 Got JSON-RPC error response 00:10:33.125 response: 00:10:33.125 { 00:10:33.125 "code": -32603, 00:10:33.125 "message": "bdevperf failed with error Operation not permitted" 00:10:33.125 } 00:10:33.383 09:14:42 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:10:33.383 09:14:42 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:33.383 09:14:42 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:33.641 09:14:42 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:33.641 09:14:42 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:33.641 09:14:42 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:33.641 00:10:33.641 real 0m9.079s 00:10:33.641 user 0m9.456s 00:10:33.641 sys 0m0.906s 00:10:33.641 09:14:42 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:33.641 09:14:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:33.641 ************************************ 00:10:33.641 END TEST bdev_error 00:10:33.641 ************************************ 00:10:33.641 09:14:42 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:33.641 09:14:42 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:33.641 09:14:42 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:33.641 09:14:42 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:33.641 09:14:42 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:33.641 ************************************ 00:10:33.641 START TEST bdev_stat 00:10:33.641 ************************************ 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=80748 00:10:33.641 09:14:42 
blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 80748' 00:10:33.641 Process Bdev IO statistics testing pid: 80748 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 80748 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 80748 ']' 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:33.641 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:33.641 09:14:42 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.641 [2024-07-15 09:14:42.489439] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:10:33.641 [2024-07-15 09:14:42.489506] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid80748 ] 00:10:33.899 [2024-07-15 09:14:42.617804] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:33.899 [2024-07-15 09:14:42.725487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:33.899 [2024-07-15 09:14:42.725492] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:34.465 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:34.465 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:34.465 09:14:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:34.722 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.722 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:34.723 Malloc_STAT 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:34.723 
09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:34.723 [ 00:10:34.723 { 00:10:34.723 "name": "Malloc_STAT", 00:10:34.723 "aliases": [ 00:10:34.723 "4d349d9f-c1a9-49ee-9123-8a7fc2fb1c15" 00:10:34.723 ], 00:10:34.723 "product_name": "Malloc disk", 00:10:34.723 "block_size": 512, 00:10:34.723 "num_blocks": 262144, 00:10:34.723 "uuid": "4d349d9f-c1a9-49ee-9123-8a7fc2fb1c15", 00:10:34.723 "assigned_rate_limits": { 00:10:34.723 "rw_ios_per_sec": 0, 00:10:34.723 "rw_mbytes_per_sec": 0, 00:10:34.723 "r_mbytes_per_sec": 0, 00:10:34.723 "w_mbytes_per_sec": 0 00:10:34.723 }, 00:10:34.723 "claimed": false, 00:10:34.723 "zoned": false, 00:10:34.723 "supported_io_types": { 00:10:34.723 "read": true, 00:10:34.723 "write": true, 00:10:34.723 "unmap": true, 00:10:34.723 "flush": true, 00:10:34.723 "reset": true, 00:10:34.723 "nvme_admin": false, 00:10:34.723 "nvme_io": false, 00:10:34.723 "nvme_io_md": false, 00:10:34.723 "write_zeroes": true, 00:10:34.723 "zcopy": true, 00:10:34.723 "get_zone_info": false, 00:10:34.723 "zone_management": false, 00:10:34.723 "zone_append": false, 00:10:34.723 "compare": false, 00:10:34.723 "compare_and_write": false, 00:10:34.723 "abort": true, 00:10:34.723 "seek_hole": false, 00:10:34.723 "seek_data": false, 00:10:34.723 "copy": true, 00:10:34.723 "nvme_iov_md": false 00:10:34.723 }, 00:10:34.723 "memory_domains": [ 00:10:34.723 { 00:10:34.723 "dma_device_id": "system", 00:10:34.723 "dma_device_type": 1 00:10:34.723 }, 00:10:34.723 { 00:10:34.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:34.723 "dma_device_type": 2 00:10:34.723 } 00:10:34.723 ], 00:10:34.723 "driver_specific": {} 00:10:34.723 } 00:10:34.723 ] 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:34.723 09:14:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:34.723 Running I/O for 10 seconds... 
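Once that ten-second run is under way, the stat suite performs the consistency check visible in the iostat dumps that follow: it snapshots the whole-device read count, reads the per-channel counters (the two channels correspond to the two reactor cores in the 0x3 mask), and expects their sum to land between two consecutive whole-device snapshots; in this run 97536 + 99840 = 197376, which sits between 190212 and 209924. A rough standalone version of that check, assuming Malloc_STAT is busy and jq is available (an illustration, not the harness code):

    # Hedged sketch of the per-channel vs. whole-device read-count check
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    count1=$($SPDK/scripts/rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    per_channel=$($SPDK/scripts/rpc.py bdev_get_iostat -b Malloc_STAT -c | jq -r '[.channels[].num_read_ops] | add')
    count2=$($SPDK/scripts/rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    # the summed channel counters were sampled between the two snapshots,
    # so they should be neither below the first nor above the second
    [ "$per_channel" -ge "$count1" ] && [ "$per_channel" -le "$count2" ] && echo PASS || echo FAIL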
00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.621 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:36.621 "tick_rate": 2300000000, 00:10:36.621 "ticks": 5324648656067768, 00:10:36.621 "bdevs": [ 00:10:36.621 { 00:10:36.621 "name": "Malloc_STAT", 00:10:36.621 "bytes_read": 779137536, 00:10:36.621 "num_read_ops": 190212, 00:10:36.621 "bytes_written": 0, 00:10:36.621 "num_write_ops": 0, 00:10:36.621 "bytes_unmapped": 0, 00:10:36.621 "num_unmap_ops": 0, 00:10:36.621 "bytes_copied": 0, 00:10:36.621 "num_copy_ops": 0, 00:10:36.621 "read_latency_ticks": 2262876065560, 00:10:36.621 "max_read_latency_ticks": 14292602, 00:10:36.621 "min_read_latency_ticks": 284332, 00:10:36.621 "write_latency_ticks": 0, 00:10:36.621 "max_write_latency_ticks": 0, 00:10:36.621 "min_write_latency_ticks": 0, 00:10:36.621 "unmap_latency_ticks": 0, 00:10:36.621 "max_unmap_latency_ticks": 0, 00:10:36.621 "min_unmap_latency_ticks": 0, 00:10:36.621 "copy_latency_ticks": 0, 00:10:36.621 "max_copy_latency_ticks": 0, 00:10:36.621 "min_copy_latency_ticks": 0, 00:10:36.621 "io_error": {} 00:10:36.621 } 00:10:36.621 ] 00:10:36.621 }' 00:10:36.622 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:36.622 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=190212 00:10:36.622 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:36.622 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.622 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:36.880 "tick_rate": 2300000000, 00:10:36.880 "ticks": 5324648824428798, 00:10:36.880 "name": "Malloc_STAT", 00:10:36.880 "channels": [ 00:10:36.880 { 00:10:36.880 "thread_id": 2, 00:10:36.880 "bytes_read": 399507456, 00:10:36.880 "num_read_ops": 97536, 00:10:36.880 "bytes_written": 0, 00:10:36.880 "num_write_ops": 0, 00:10:36.880 "bytes_unmapped": 0, 00:10:36.880 "num_unmap_ops": 0, 
00:10:36.880 "bytes_copied": 0, 00:10:36.880 "num_copy_ops": 0, 00:10:36.880 "read_latency_ticks": 1173723592844, 00:10:36.880 "max_read_latency_ticks": 12830106, 00:10:36.880 "min_read_latency_ticks": 7901838, 00:10:36.880 "write_latency_ticks": 0, 00:10:36.880 "max_write_latency_ticks": 0, 00:10:36.880 "min_write_latency_ticks": 0, 00:10:36.880 "unmap_latency_ticks": 0, 00:10:36.880 "max_unmap_latency_ticks": 0, 00:10:36.880 "min_unmap_latency_ticks": 0, 00:10:36.880 "copy_latency_ticks": 0, 00:10:36.880 "max_copy_latency_ticks": 0, 00:10:36.880 "min_copy_latency_ticks": 0 00:10:36.880 }, 00:10:36.880 { 00:10:36.880 "thread_id": 3, 00:10:36.880 "bytes_read": 408944640, 00:10:36.880 "num_read_ops": 99840, 00:10:36.880 "bytes_written": 0, 00:10:36.880 "num_write_ops": 0, 00:10:36.880 "bytes_unmapped": 0, 00:10:36.880 "num_unmap_ops": 0, 00:10:36.880 "bytes_copied": 0, 00:10:36.880 "num_copy_ops": 0, 00:10:36.880 "read_latency_ticks": 1174202848716, 00:10:36.880 "max_read_latency_ticks": 14292602, 00:10:36.880 "min_read_latency_ticks": 7875150, 00:10:36.880 "write_latency_ticks": 0, 00:10:36.880 "max_write_latency_ticks": 0, 00:10:36.880 "min_write_latency_ticks": 0, 00:10:36.880 "unmap_latency_ticks": 0, 00:10:36.880 "max_unmap_latency_ticks": 0, 00:10:36.880 "min_unmap_latency_ticks": 0, 00:10:36.880 "copy_latency_ticks": 0, 00:10:36.880 "max_copy_latency_ticks": 0, 00:10:36.880 "min_copy_latency_ticks": 0 00:10:36.880 } 00:10:36.880 ] 00:10:36.880 }' 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=97536 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=97536 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=99840 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=197376 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:36.880 "tick_rate": 2300000000, 00:10:36.880 "ticks": 5324649114291834, 00:10:36.880 "bdevs": [ 00:10:36.880 { 00:10:36.880 "name": "Malloc_STAT", 00:10:36.880 "bytes_read": 859877888, 00:10:36.880 "num_read_ops": 209924, 00:10:36.880 "bytes_written": 0, 00:10:36.880 "num_write_ops": 0, 00:10:36.880 "bytes_unmapped": 0, 00:10:36.880 "num_unmap_ops": 0, 00:10:36.880 "bytes_copied": 0, 00:10:36.880 "num_copy_ops": 0, 00:10:36.880 "read_latency_ticks": 2496955933186, 00:10:36.880 "max_read_latency_ticks": 14292602, 00:10:36.880 "min_read_latency_ticks": 284332, 00:10:36.880 "write_latency_ticks": 0, 00:10:36.880 "max_write_latency_ticks": 0, 00:10:36.880 "min_write_latency_ticks": 0, 00:10:36.880 "unmap_latency_ticks": 0, 00:10:36.880 "max_unmap_latency_ticks": 0, 00:10:36.880 "min_unmap_latency_ticks": 0, 00:10:36.880 "copy_latency_ticks": 0, 00:10:36.880 "max_copy_latency_ticks": 0, 00:10:36.880 
"min_copy_latency_ticks": 0, 00:10:36.880 "io_error": {} 00:10:36.880 } 00:10:36.880 ] 00:10:36.880 }' 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=209924 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 197376 -lt 190212 ']' 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 197376 -gt 209924 ']' 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:36.880 00:10:36.880 Latency(us) 00:10:36.880 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:36.880 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:36.880 Malloc_STAT : 2.20 48832.40 190.75 0.00 0.00 5229.55 1894.85 5584.81 00:10:36.880 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:36.880 Malloc_STAT : 2.20 49994.03 195.29 0.00 0.00 5108.05 1837.86 6240.17 00:10:36.880 =================================================================================================================== 00:10:36.880 Total : 98826.42 386.04 0.00 0.00 5168.08 1837.86 6240.17 00:10:36.880 0 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 80748 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 80748 ']' 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 80748 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:36.880 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 80748 00:10:37.139 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:37.139 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:37.139 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 80748' 00:10:37.139 killing process with pid 80748 00:10:37.139 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 80748 00:10:37.139 Received shutdown signal, test time was about 2.279953 seconds 00:10:37.139 00:10:37.139 Latency(us) 00:10:37.139 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:37.139 =================================================================================================================== 00:10:37.139 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:37.139 09:14:45 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 80748 00:10:37.139 09:14:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:37.139 00:10:37.139 real 0m3.642s 00:10:37.139 user 0m7.226s 00:10:37.139 sys 0m0.497s 00:10:37.139 09:14:46 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:37.139 09:14:46 blockdev_general.bdev_stat 
-- common/autotest_common.sh@10 -- # set +x 00:10:37.139 ************************************ 00:10:37.139 END TEST bdev_stat 00:10:37.139 ************************************ 00:10:37.397 09:14:46 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:37.397 09:14:46 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:37.397 00:10:37.397 real 1m56.115s 00:10:37.397 user 7m9.574s 00:10:37.397 sys 0m23.208s 00:10:37.397 09:14:46 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:37.397 09:14:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:37.397 ************************************ 00:10:37.397 END TEST blockdev_general 00:10:37.397 ************************************ 00:10:37.397 09:14:46 -- common/autotest_common.sh@1142 -- # return 0 00:10:37.397 09:14:46 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:37.397 09:14:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:37.397 09:14:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:37.397 09:14:46 -- common/autotest_common.sh@10 -- # set +x 00:10:37.397 ************************************ 00:10:37.397 START TEST bdev_raid 00:10:37.397 ************************************ 00:10:37.397 09:14:46 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:37.397 * Looking for test storage... 
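
The bdev_stat run above reduces to a handful of RPC calls against the running SPDK app. A minimal sketch of the same check, assuming rpc.py stands for /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py and a Malloc_STAT bdev is already serving random reads:

# Aggregate I/O statistics for the bdev; num_read_ops is the first snapshot.
io_count1=$(rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')

# Per-channel statistics (-c); one entry per thread driving I/O.
iostats_per_channel=$(rpc.py bdev_get_iostat -b Malloc_STAT -c)
ch1=$(echo "$iostats_per_channel" | jq -r '.channels[0].num_read_ops')
ch2=$(echo "$iostats_per_channel" | jq -r '.channels[1].num_read_ops')
io_count_per_channel_all=$((ch1 + ch2))

# Second aggregate snapshot; while I/O keeps running, the per-channel total
# sampled in between must not fall outside the [io_count1, io_count2] window.
io_count2=$(rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
[ "$io_count_per_channel_all" -lt "$io_count1" ] && exit 1
[ "$io_count_per_channel_all" -gt "$io_count2" ] && exit 1

The numbers in the trace behave accordingly: 97536 + 99840 = 197376 lies between 190212 and 209924, and the summary table's 386.04 MiB/s is just the 98826.42 total IOPS multiplied by the 4096-byte I/O size.
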
00:10:37.397 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:37.397 09:14:46 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:37.397 09:14:46 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:37.397 09:14:46 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:37.397 09:14:46 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:37.397 09:14:46 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:37.397 09:14:46 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:37.397 09:14:46 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:37.397 09:14:46 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:37.397 09:14:46 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:37.655 09:14:46 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:37.655 09:14:46 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:37.655 09:14:46 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:37.655 09:14:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:37.655 09:14:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:37.655 09:14:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:37.655 ************************************ 00:10:37.655 START TEST raid_function_test_raid0 00:10:37.655 ************************************ 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=81364 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 81364' 00:10:37.655 Process raid pid: 81364 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 81364 /var/tmp/spdk-raid.sock 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 81364 ']' 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:37.655 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
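
Each of the raid tests below drives a dedicated bdev_svc application over its own RPC socket rather than the default one. Roughly, and replacing the harness's waitforlisten helper with a plain polling loop (an assumption, not the actual helper), the setup amounts to:

svc=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Start the stub app on a private socket with bdev_raid debug logging enabled.
$svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
raid_pid=$!

# Simplified stand-in for waitforlisten: poll until the RPC socket answers.
until $rpc_py rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done

Keeping the raid tests on /var/tmp/spdk-raid.sock leaves the default /var/tmp/spdk.sock free for anything else the autotest run has in flight.
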
00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:37.655 09:14:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:37.655 [2024-07-15 09:14:46.453881] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:10:37.655 [2024-07-15 09:14:46.453955] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:37.655 [2024-07-15 09:14:46.576499] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:37.913 [2024-07-15 09:14:46.674350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:37.913 [2024-07-15 09:14:46.740912] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:37.913 [2024-07-15 09:14:46.740958] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:38.478 09:14:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:38.478 09:14:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:10:38.478 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:38.478 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:38.478 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:38.478 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:38.478 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:38.735 [2024-07-15 09:14:47.653888] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:38.735 [2024-07-15 09:14:47.655353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:38.735 [2024-07-15 09:14:47.655413] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9c5bd0 00:10:38.735 [2024-07-15 09:14:47.655424] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:38.735 [2024-07-15 09:14:47.655616] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9c5b10 00:10:38.735 [2024-07-15 09:14:47.655737] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9c5bd0 00:10:38.735 [2024-07-15 09:14:47.655747] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x9c5bd0 00:10:38.735 [2024-07-15 09:14:47.655850] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:38.735 Base_1 00:10:38.735 Base_2 00:10:38.736 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:38.736 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:38.736 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:38.993 09:14:47 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:38.993 09:14:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:39.331 [2024-07-15 09:14:48.151237] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb798e0 00:10:39.331 /dev/nbd0 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:39.331 1+0 records in 00:10:39.331 1+0 records out 00:10:39.331 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244861 s, 16.7 MB/s 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:10:39.331 09:14:48 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:39.331 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:39.590 { 00:10:39.590 "nbd_device": "/dev/nbd0", 00:10:39.590 "bdev_name": "raid" 00:10:39.590 } 00:10:39.590 ]' 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:39.590 { 00:10:39.590 "nbd_device": "/dev/nbd0", 00:10:39.590 "bdev_name": "raid" 00:10:39.590 } 00:10:39.590 ]' 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:39.590 
09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:39.590 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:39.849 4096+0 records in 00:10:39.849 4096+0 records out 00:10:39.849 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0305155 s, 68.7 MB/s 00:10:39.849 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:39.849 4096+0 records in 00:10:39.849 4096+0 records out 00:10:39.849 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.209168 s, 10.0 MB/s 00:10:39.849 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:39.849 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:39.849 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:39.849 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:39.849 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:39.849 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:39.849 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:39.849 128+0 records in 00:10:39.849 128+0 records out 00:10:39.849 65536 bytes (66 kB, 64 KiB) copied, 0.000836439 s, 78.4 MB/s 00:10:39.849 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:40.108 2035+0 records in 00:10:40.108 2035+0 records out 00:10:40.108 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.010666 s, 97.7 MB/s 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:40.108 456+0 records in 00:10:40.108 456+0 records out 00:10:40.108 233472 bytes (233 kB, 228 KiB) copied, 0.00273298 s, 85.4 MB/s 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:40.108 09:14:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:40.367 [2024-07-15 09:14:49.143637] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:40.367 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 81364 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 81364 ']' 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 81364 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 81364 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 81364' 00:10:40.626 killing process with pid 81364 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 81364 00:10:40.626 [2024-07-15 09:14:49.508520] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:40.626 [2024-07-15 09:14:49.508591] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:40.626 [2024-07-15 09:14:49.508634] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:40.626 [2024-07-15 09:14:49.508655] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9c5bd0 name raid, state offline 00:10:40.626 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 81364 00:10:40.626 [2024-07-15 09:14:49.525796] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:40.886 09:14:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:10:40.886 00:10:40.886 real 0m3.349s 00:10:40.886 user 0m4.477s 00:10:40.886 sys 0m1.234s 00:10:40.886 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:40.886 09:14:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:40.886 ************************************ 00:10:40.886 END TEST raid_function_test_raid0 00:10:40.886 ************************************ 00:10:40.886 09:14:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:40.886 09:14:49 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:10:40.886 09:14:49 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:40.886 09:14:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:40.886 09:14:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:40.886 ************************************ 00:10:40.886 START TEST raid_function_test_concat 00:10:40.886 ************************************ 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=81960 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 81960' 00:10:40.886 Process raid pid: 81960 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 81960 /var/tmp/spdk-raid.sock 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 81960 ']' 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:40.886 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:40.886 09:14:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:41.145 [2024-07-15 09:14:49.892761] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
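
Both raid_function_test variants (raid0 above, concat below) follow the same pattern: build a two-member array over RPC, expose it through NBD, then verify that discarded regions read back as zeros exactly where the reference file says they should. A condensed sketch follows; the malloc base bdevs and the 64 KiB strip size are assumptions, since the harness feeds a pre-built rpcs.txt that the trace does not echo, and the 32 MiB member size is inferred from the 131072-block result.

# Two 32 MiB base bdevs joined into one raid bdev named "raid".
$rpc_py bdev_malloc_create -b Base_1 32 512
$rpc_py bdev_malloc_create -b Base_2 32 512
$rpc_py bdev_raid_create -n raid -r raid0 -z 64 -b "Base_1 Base_2"

# Expose the array as a block device and lay down a known pattern.
$rpc_py nbd_start_disk raid /dev/nbd0
dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096
dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct
blockdev --flushbufs /dev/nbd0
cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0

# Discard three regions (offsets/lengths in 512-byte blocks), zero the same
# ranges in the reference file, and check the device still matches it.
unmap_blk_offs=(0 1028 321); unmap_blk_nums=(128 2035 456)
for i in 0 1 2; do
  off=$(( unmap_blk_offs[i] * 512 )); len=$(( unmap_blk_nums[i] * 512 ))
  dd if=/dev/zero of=/raidtest/raidrandtest bs=512 \
     seek=${unmap_blk_offs[i]} count=${unmap_blk_nums[i]} conv=notrunc
  blkdiscard -o $off -l $len /dev/nbd0
  blockdev --flushbufs /dev/nbd0
  cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
done
$rpc_py nbd_stop_disk /dev/nbd0

The concat run below differs only in the raid level passed to the test; its 2 MiB direct write is a little slower (0.28 s versus 0.21 s for raid0), but the unmap/verify loop and its offsets are identical.
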
00:10:41.145 [2024-07-15 09:14:49.892836] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:41.145 [2024-07-15 09:14:50.023932] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:41.404 [2024-07-15 09:14:50.132075] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.404 [2024-07-15 09:14:50.191776] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:41.404 [2024-07-15 09:14:50.191805] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:41.971 09:14:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:41.971 09:14:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:10:41.971 09:14:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:10:41.971 09:14:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:10:41.971 09:14:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:41.971 09:14:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:10:41.971 09:14:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:42.229 [2024-07-15 09:14:51.012598] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:42.229 [2024-07-15 09:14:51.014074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:42.229 [2024-07-15 09:14:51.014135] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b76bd0 00:10:42.229 [2024-07-15 09:14:51.014146] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:42.229 [2024-07-15 09:14:51.014336] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b76b10 00:10:42.229 [2024-07-15 09:14:51.014457] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b76bd0 00:10:42.229 [2024-07-15 09:14:51.014467] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1b76bd0 00:10:42.229 [2024-07-15 09:14:51.014567] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:42.229 Base_1 00:10:42.229 Base_2 00:10:42.229 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:42.229 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:42.229 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:42.487 [2024-07-15 09:14:51.381579] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d2a8e0 00:10:42.487 /dev/nbd0 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:42.487 1+0 records in 00:10:42.487 1+0 records out 00:10:42.487 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250455 s, 16.4 MB/s 00:10:42.487 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:42.745 
09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:42.745 { 00:10:42.745 "nbd_device": "/dev/nbd0", 00:10:42.745 "bdev_name": "raid" 00:10:42.745 } 00:10:42.745 ]' 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:42.745 { 00:10:42.745 "nbd_device": "/dev/nbd0", 00:10:42.745 "bdev_name": "raid" 00:10:42.745 } 00:10:42.745 ]' 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:42.745 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:43.003 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:43.003 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:43.003 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:43.003 4096+0 records in 00:10:43.003 4096+0 records out 00:10:43.003 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0300738 s, 69.7 MB/s 00:10:43.003 09:14:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:43.261 4096+0 records in 00:10:43.261 4096+0 records out 00:10:43.261 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.275885 s, 7.6 MB/s 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:43.261 128+0 records in 00:10:43.261 128+0 records out 00:10:43.261 65536 bytes (66 kB, 64 KiB) copied, 0.000853643 s, 76.8 MB/s 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:43.261 2035+0 records in 00:10:43.261 2035+0 records out 00:10:43.261 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0115509 s, 90.2 MB/s 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:43.261 456+0 records in 00:10:43.261 456+0 
records out 00:10:43.261 233472 bytes (233 kB, 228 KiB) copied, 0.00272831 s, 85.6 MB/s 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:43.261 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:43.518 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:43.519 [2024-07-15 09:14:52.380651] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:43.519 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # echo '' 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 81960 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 81960 ']' 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 81960 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:43.776 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 81960 00:10:44.034 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:44.034 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:44.034 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 81960' 00:10:44.034 killing process with pid 81960 00:10:44.034 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 81960 00:10:44.034 [2024-07-15 09:14:52.748071] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:44.034 [2024-07-15 09:14:52.748138] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:44.034 [2024-07-15 09:14:52.748182] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:44.034 [2024-07-15 09:14:52.748195] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b76bd0 name raid, state offline 00:10:44.034 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 81960 00:10:44.034 [2024-07-15 09:14:52.764694] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:44.034 09:14:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:10:44.034 00:10:44.034 real 0m3.139s 00:10:44.034 user 0m4.045s 00:10:44.034 sys 0m1.204s 00:10:44.034 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:44.034 09:14:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:44.034 ************************************ 00:10:44.034 END TEST raid_function_test_concat 00:10:44.034 ************************************ 00:10:44.292 09:14:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:44.292 09:14:53 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:10:44.292 09:14:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:44.292 09:14:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:44.292 09:14:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:44.292 
************************************ 00:10:44.292 START TEST raid0_resize_test 00:10:44.292 ************************************ 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=82417 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 82417' 00:10:44.292 Process raid pid: 82417 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 82417 /var/tmp/spdk-raid.sock 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 82417 ']' 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:44.292 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:44.292 09:14:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:44.292 [2024-07-15 09:14:53.111224] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:10:44.292 [2024-07-15 09:14:53.111289] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:44.292 [2024-07-15 09:14:53.242865] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:44.549 [2024-07-15 09:14:53.350375] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.549 [2024-07-15 09:14:53.417740] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:44.549 [2024-07-15 09:14:53.417783] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:45.143 09:14:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:45.143 09:14:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:10:45.143 09:14:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:45.401 Base_1 00:10:45.401 09:14:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:45.658 Base_2 00:10:45.658 09:14:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:45.917 [2024-07-15 09:14:54.746226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:45.917 [2024-07-15 09:14:54.747603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:45.917 [2024-07-15 09:14:54.747653] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15d9780 00:10:45.917 [2024-07-15 09:14:54.747666] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:45.917 [2024-07-15 09:14:54.747867] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1125020 00:10:45.917 [2024-07-15 09:14:54.747972] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15d9780 00:10:45.917 [2024-07-15 09:14:54.747983] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x15d9780 00:10:45.917 [2024-07-15 09:14:54.748090] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:45.917 09:14:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:46.175 [2024-07-15 09:14:54.978830] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:46.175 [2024-07-15 09:14:54.978851] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:46.175 true 00:10:46.175 09:14:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:46.175 09:14:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:10:46.433 [2024-07-15 09:14:55.223624] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:46.433 09:14:55 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:10:46.433 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:10:46.433 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:10:46.433 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:46.691 [2024-07-15 09:14:55.468095] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:46.691 [2024-07-15 09:14:55.468114] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:46.691 [2024-07-15 09:14:55.468139] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:46.691 true 00:10:46.691 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:46.691 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:10:46.948 [2024-07-15 09:14:55.712967] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 82417 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 82417 ']' 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 82417 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 82417 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 82417' 00:10:46.948 killing process with pid 82417 00:10:46.948 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 82417 00:10:46.948 [2024-07-15 09:14:55.765642] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:46.949 [2024-07-15 09:14:55.765690] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:46.949 [2024-07-15 09:14:55.765729] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:46.949 [2024-07-15 09:14:55.765740] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15d9780 name Raid, state offline 00:10:46.949 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 82417 00:10:46.949 [2024-07-15 09:14:55.766979] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:47.207 09:14:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:10:47.207 
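For reference, the raid0_resize_test run above reduces to a short RPC sequence against the bdev_svc app listening on /var/tmp/spdk-raid.sock. A condensed sketch reconstructed from the xtrace output above (not part of the captured console output; rpc.py paths are abbreviated, and the MiB sizes and 512-byte block size are the values this run used):

  # two 32 MiB null bdevs become the raid0 base devices
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512
  # 64 KiB strip, level 0 -> "Raid" starts out at 131072 blocks (2 x 32 MiB / 512 B)
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid
  # grow each base bdev to 64 MiB; the raid block count doubles only once both are resized
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # still 131072
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # now 262144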
00:10:47.207 real 0m2.907s 00:10:47.207 user 0m4.463s 00:10:47.207 sys 0m0.651s 00:10:47.207 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:47.207 09:14:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:47.207 ************************************ 00:10:47.207 END TEST raid0_resize_test 00:10:47.207 ************************************ 00:10:47.207 09:14:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:47.207 09:14:55 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:47.207 09:14:55 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:47.207 09:14:55 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:47.207 09:14:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:47.207 09:14:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:47.207 09:14:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:47.207 ************************************ 00:10:47.207 START TEST raid_state_function_test 00:10:47.207 ************************************ 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:47.207 09:14:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=82813 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 82813' 00:10:47.207 Process raid pid: 82813 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 82813 /var/tmp/spdk-raid.sock 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 82813 ']' 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:47.207 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:47.207 09:14:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:47.207 [2024-07-15 09:14:56.094245] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:10:47.207 [2024-07-15 09:14:56.094312] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:47.466 [2024-07-15 09:14:56.224098] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:47.466 [2024-07-15 09:14:56.331094] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:47.466 [2024-07-15 09:14:56.395797] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:47.466 [2024-07-15 09:14:56.395830] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:48.401 [2024-07-15 09:14:57.259533] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:48.401 [2024-07-15 09:14:57.259575] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:48.401 [2024-07-15 09:14:57.259586] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:48.401 [2024-07-15 09:14:57.259597] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.401 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:48.659 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:48.659 "name": "Existed_Raid", 00:10:48.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:48.659 "strip_size_kb": 64, 00:10:48.659 "state": "configuring", 00:10:48.659 "raid_level": "raid0", 00:10:48.659 "superblock": false, 00:10:48.659 
"num_base_bdevs": 2, 00:10:48.659 "num_base_bdevs_discovered": 0, 00:10:48.659 "num_base_bdevs_operational": 2, 00:10:48.659 "base_bdevs_list": [ 00:10:48.659 { 00:10:48.659 "name": "BaseBdev1", 00:10:48.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:48.659 "is_configured": false, 00:10:48.659 "data_offset": 0, 00:10:48.659 "data_size": 0 00:10:48.659 }, 00:10:48.659 { 00:10:48.659 "name": "BaseBdev2", 00:10:48.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:48.659 "is_configured": false, 00:10:48.659 "data_offset": 0, 00:10:48.659 "data_size": 0 00:10:48.659 } 00:10:48.659 ] 00:10:48.659 }' 00:10:48.659 09:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:48.659 09:14:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.225 09:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:49.483 [2024-07-15 09:14:58.282110] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:49.483 [2024-07-15 09:14:58.282140] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x206ba80 name Existed_Raid, state configuring 00:10:49.483 09:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:49.742 [2024-07-15 09:14:58.526774] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:49.742 [2024-07-15 09:14:58.526813] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:49.742 [2024-07-15 09:14:58.526822] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:49.742 [2024-07-15 09:14:58.526834] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:49.742 09:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:50.000 [2024-07-15 09:14:58.781231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:50.000 BaseBdev1 00:10:50.000 09:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:50.000 09:14:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:50.000 09:14:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:50.000 09:14:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:50.000 09:14:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:50.000 09:14:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:50.000 09:14:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:50.258 09:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:50.517 [ 
00:10:50.517 { 00:10:50.517 "name": "BaseBdev1", 00:10:50.517 "aliases": [ 00:10:50.517 "39460909-401a-4fc5-9ccb-0e1919899083" 00:10:50.517 ], 00:10:50.517 "product_name": "Malloc disk", 00:10:50.517 "block_size": 512, 00:10:50.517 "num_blocks": 65536, 00:10:50.517 "uuid": "39460909-401a-4fc5-9ccb-0e1919899083", 00:10:50.517 "assigned_rate_limits": { 00:10:50.517 "rw_ios_per_sec": 0, 00:10:50.517 "rw_mbytes_per_sec": 0, 00:10:50.517 "r_mbytes_per_sec": 0, 00:10:50.517 "w_mbytes_per_sec": 0 00:10:50.517 }, 00:10:50.517 "claimed": true, 00:10:50.517 "claim_type": "exclusive_write", 00:10:50.517 "zoned": false, 00:10:50.517 "supported_io_types": { 00:10:50.517 "read": true, 00:10:50.517 "write": true, 00:10:50.517 "unmap": true, 00:10:50.517 "flush": true, 00:10:50.517 "reset": true, 00:10:50.517 "nvme_admin": false, 00:10:50.517 "nvme_io": false, 00:10:50.517 "nvme_io_md": false, 00:10:50.517 "write_zeroes": true, 00:10:50.517 "zcopy": true, 00:10:50.517 "get_zone_info": false, 00:10:50.517 "zone_management": false, 00:10:50.517 "zone_append": false, 00:10:50.517 "compare": false, 00:10:50.517 "compare_and_write": false, 00:10:50.517 "abort": true, 00:10:50.517 "seek_hole": false, 00:10:50.517 "seek_data": false, 00:10:50.517 "copy": true, 00:10:50.517 "nvme_iov_md": false 00:10:50.517 }, 00:10:50.517 "memory_domains": [ 00:10:50.517 { 00:10:50.517 "dma_device_id": "system", 00:10:50.517 "dma_device_type": 1 00:10:50.517 }, 00:10:50.517 { 00:10:50.517 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.517 "dma_device_type": 2 00:10:50.517 } 00:10:50.517 ], 00:10:50.517 "driver_specific": {} 00:10:50.517 } 00:10:50.517 ] 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.517 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:50.776 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:50.776 "name": "Existed_Raid", 00:10:50.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:50.776 "strip_size_kb": 64, 00:10:50.776 "state": "configuring", 00:10:50.776 "raid_level": "raid0", 
00:10:50.776 "superblock": false, 00:10:50.776 "num_base_bdevs": 2, 00:10:50.776 "num_base_bdevs_discovered": 1, 00:10:50.776 "num_base_bdevs_operational": 2, 00:10:50.776 "base_bdevs_list": [ 00:10:50.776 { 00:10:50.776 "name": "BaseBdev1", 00:10:50.776 "uuid": "39460909-401a-4fc5-9ccb-0e1919899083", 00:10:50.776 "is_configured": true, 00:10:50.776 "data_offset": 0, 00:10:50.776 "data_size": 65536 00:10:50.776 }, 00:10:50.776 { 00:10:50.776 "name": "BaseBdev2", 00:10:50.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:50.776 "is_configured": false, 00:10:50.776 "data_offset": 0, 00:10:50.776 "data_size": 0 00:10:50.776 } 00:10:50.776 ] 00:10:50.776 }' 00:10:50.776 09:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:50.776 09:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.340 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:51.598 [2024-07-15 09:15:00.353407] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:51.598 [2024-07-15 09:15:00.353450] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x206b350 name Existed_Raid, state configuring 00:10:51.598 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:51.857 [2024-07-15 09:15:00.610111] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:51.857 [2024-07-15 09:15:00.611603] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:51.857 [2024-07-15 09:15:00.611637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:10:51.857 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:52.114 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:52.114 "name": "Existed_Raid", 00:10:52.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:52.114 "strip_size_kb": 64, 00:10:52.114 "state": "configuring", 00:10:52.114 "raid_level": "raid0", 00:10:52.114 "superblock": false, 00:10:52.114 "num_base_bdevs": 2, 00:10:52.114 "num_base_bdevs_discovered": 1, 00:10:52.114 "num_base_bdevs_operational": 2, 00:10:52.114 "base_bdevs_list": [ 00:10:52.114 { 00:10:52.114 "name": "BaseBdev1", 00:10:52.114 "uuid": "39460909-401a-4fc5-9ccb-0e1919899083", 00:10:52.114 "is_configured": true, 00:10:52.114 "data_offset": 0, 00:10:52.114 "data_size": 65536 00:10:52.114 }, 00:10:52.114 { 00:10:52.114 "name": "BaseBdev2", 00:10:52.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:52.114 "is_configured": false, 00:10:52.114 "data_offset": 0, 00:10:52.114 "data_size": 0 00:10:52.114 } 00:10:52.114 ] 00:10:52.114 }' 00:10:52.114 09:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:52.114 09:15:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.677 09:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:52.936 [2024-07-15 09:15:01.716360] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:52.936 [2024-07-15 09:15:01.716398] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x206c000 00:10:52.936 [2024-07-15 09:15:01.716407] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:52.936 [2024-07-15 09:15:01.716596] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f860c0 00:10:52.936 [2024-07-15 09:15:01.716721] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x206c000 00:10:52.936 [2024-07-15 09:15:01.716732] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x206c000 00:10:52.936 [2024-07-15 09:15:01.716892] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:52.936 BaseBdev2 00:10:52.936 09:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:52.936 09:15:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:52.936 09:15:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:52.936 09:15:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:52.936 09:15:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:52.936 09:15:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:52.936 09:15:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:53.194 09:15:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:53.452 [ 
00:10:53.452 { 00:10:53.452 "name": "BaseBdev2", 00:10:53.452 "aliases": [ 00:10:53.452 "18749b34-9aec-42ba-a4bc-f90a3b2c55c1" 00:10:53.452 ], 00:10:53.452 "product_name": "Malloc disk", 00:10:53.452 "block_size": 512, 00:10:53.452 "num_blocks": 65536, 00:10:53.452 "uuid": "18749b34-9aec-42ba-a4bc-f90a3b2c55c1", 00:10:53.452 "assigned_rate_limits": { 00:10:53.452 "rw_ios_per_sec": 0, 00:10:53.452 "rw_mbytes_per_sec": 0, 00:10:53.452 "r_mbytes_per_sec": 0, 00:10:53.452 "w_mbytes_per_sec": 0 00:10:53.452 }, 00:10:53.452 "claimed": true, 00:10:53.452 "claim_type": "exclusive_write", 00:10:53.452 "zoned": false, 00:10:53.452 "supported_io_types": { 00:10:53.452 "read": true, 00:10:53.452 "write": true, 00:10:53.452 "unmap": true, 00:10:53.452 "flush": true, 00:10:53.452 "reset": true, 00:10:53.452 "nvme_admin": false, 00:10:53.452 "nvme_io": false, 00:10:53.452 "nvme_io_md": false, 00:10:53.452 "write_zeroes": true, 00:10:53.452 "zcopy": true, 00:10:53.452 "get_zone_info": false, 00:10:53.452 "zone_management": false, 00:10:53.452 "zone_append": false, 00:10:53.452 "compare": false, 00:10:53.452 "compare_and_write": false, 00:10:53.452 "abort": true, 00:10:53.452 "seek_hole": false, 00:10:53.452 "seek_data": false, 00:10:53.452 "copy": true, 00:10:53.452 "nvme_iov_md": false 00:10:53.452 }, 00:10:53.452 "memory_domains": [ 00:10:53.452 { 00:10:53.452 "dma_device_id": "system", 00:10:53.452 "dma_device_type": 1 00:10:53.452 }, 00:10:53.452 { 00:10:53.452 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.452 "dma_device_type": 2 00:10:53.452 } 00:10:53.452 ], 00:10:53.452 "driver_specific": {} 00:10:53.452 } 00:10:53.452 ] 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:53.452 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:53.710 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:53.710 "name": "Existed_Raid", 00:10:53.710 "uuid": "4e395e70-cdee-470b-93cd-72ae8765b7a4", 00:10:53.710 "strip_size_kb": 64, 00:10:53.710 "state": "online", 00:10:53.710 "raid_level": "raid0", 00:10:53.710 "superblock": false, 00:10:53.710 "num_base_bdevs": 2, 00:10:53.710 "num_base_bdevs_discovered": 2, 00:10:53.710 "num_base_bdevs_operational": 2, 00:10:53.710 "base_bdevs_list": [ 00:10:53.710 { 00:10:53.710 "name": "BaseBdev1", 00:10:53.710 "uuid": "39460909-401a-4fc5-9ccb-0e1919899083", 00:10:53.710 "is_configured": true, 00:10:53.710 "data_offset": 0, 00:10:53.710 "data_size": 65536 00:10:53.710 }, 00:10:53.710 { 00:10:53.710 "name": "BaseBdev2", 00:10:53.710 "uuid": "18749b34-9aec-42ba-a4bc-f90a3b2c55c1", 00:10:53.710 "is_configured": true, 00:10:53.710 "data_offset": 0, 00:10:53.710 "data_size": 65536 00:10:53.710 } 00:10:53.710 ] 00:10:53.710 }' 00:10:53.710 09:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:53.710 09:15:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:54.643 [2024-07-15 09:15:03.549498] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:54.643 "name": "Existed_Raid", 00:10:54.643 "aliases": [ 00:10:54.643 "4e395e70-cdee-470b-93cd-72ae8765b7a4" 00:10:54.643 ], 00:10:54.643 "product_name": "Raid Volume", 00:10:54.643 "block_size": 512, 00:10:54.643 "num_blocks": 131072, 00:10:54.643 "uuid": "4e395e70-cdee-470b-93cd-72ae8765b7a4", 00:10:54.643 "assigned_rate_limits": { 00:10:54.643 "rw_ios_per_sec": 0, 00:10:54.643 "rw_mbytes_per_sec": 0, 00:10:54.643 "r_mbytes_per_sec": 0, 00:10:54.643 "w_mbytes_per_sec": 0 00:10:54.643 }, 00:10:54.643 "claimed": false, 00:10:54.643 "zoned": false, 00:10:54.643 "supported_io_types": { 00:10:54.643 "read": true, 00:10:54.643 "write": true, 00:10:54.643 "unmap": true, 00:10:54.643 "flush": true, 00:10:54.643 "reset": true, 00:10:54.643 "nvme_admin": false, 00:10:54.643 "nvme_io": false, 00:10:54.643 "nvme_io_md": false, 00:10:54.643 "write_zeroes": true, 00:10:54.643 "zcopy": false, 00:10:54.643 "get_zone_info": false, 00:10:54.643 "zone_management": false, 00:10:54.643 "zone_append": false, 00:10:54.643 "compare": false, 00:10:54.643 "compare_and_write": false, 00:10:54.643 "abort": false, 00:10:54.643 "seek_hole": false, 00:10:54.643 "seek_data": false, 00:10:54.643 "copy": false, 00:10:54.643 "nvme_iov_md": false 00:10:54.643 }, 00:10:54.643 
"memory_domains": [ 00:10:54.643 { 00:10:54.643 "dma_device_id": "system", 00:10:54.643 "dma_device_type": 1 00:10:54.643 }, 00:10:54.643 { 00:10:54.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.643 "dma_device_type": 2 00:10:54.643 }, 00:10:54.643 { 00:10:54.643 "dma_device_id": "system", 00:10:54.643 "dma_device_type": 1 00:10:54.643 }, 00:10:54.643 { 00:10:54.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.643 "dma_device_type": 2 00:10:54.643 } 00:10:54.643 ], 00:10:54.643 "driver_specific": { 00:10:54.643 "raid": { 00:10:54.643 "uuid": "4e395e70-cdee-470b-93cd-72ae8765b7a4", 00:10:54.643 "strip_size_kb": 64, 00:10:54.643 "state": "online", 00:10:54.643 "raid_level": "raid0", 00:10:54.643 "superblock": false, 00:10:54.643 "num_base_bdevs": 2, 00:10:54.643 "num_base_bdevs_discovered": 2, 00:10:54.643 "num_base_bdevs_operational": 2, 00:10:54.643 "base_bdevs_list": [ 00:10:54.643 { 00:10:54.643 "name": "BaseBdev1", 00:10:54.643 "uuid": "39460909-401a-4fc5-9ccb-0e1919899083", 00:10:54.643 "is_configured": true, 00:10:54.643 "data_offset": 0, 00:10:54.643 "data_size": 65536 00:10:54.643 }, 00:10:54.643 { 00:10:54.643 "name": "BaseBdev2", 00:10:54.643 "uuid": "18749b34-9aec-42ba-a4bc-f90a3b2c55c1", 00:10:54.643 "is_configured": true, 00:10:54.643 "data_offset": 0, 00:10:54.643 "data_size": 65536 00:10:54.643 } 00:10:54.643 ] 00:10:54.643 } 00:10:54.643 } 00:10:54.643 }' 00:10:54.643 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:54.901 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:54.901 BaseBdev2' 00:10:54.901 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:54.901 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:54.901 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:54.901 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:54.901 "name": "BaseBdev1", 00:10:54.901 "aliases": [ 00:10:54.901 "39460909-401a-4fc5-9ccb-0e1919899083" 00:10:54.901 ], 00:10:54.901 "product_name": "Malloc disk", 00:10:54.901 "block_size": 512, 00:10:54.901 "num_blocks": 65536, 00:10:54.901 "uuid": "39460909-401a-4fc5-9ccb-0e1919899083", 00:10:54.901 "assigned_rate_limits": { 00:10:54.901 "rw_ios_per_sec": 0, 00:10:54.901 "rw_mbytes_per_sec": 0, 00:10:54.901 "r_mbytes_per_sec": 0, 00:10:54.901 "w_mbytes_per_sec": 0 00:10:54.901 }, 00:10:54.901 "claimed": true, 00:10:54.901 "claim_type": "exclusive_write", 00:10:54.901 "zoned": false, 00:10:54.901 "supported_io_types": { 00:10:54.901 "read": true, 00:10:54.901 "write": true, 00:10:54.901 "unmap": true, 00:10:54.901 "flush": true, 00:10:54.901 "reset": true, 00:10:54.901 "nvme_admin": false, 00:10:54.901 "nvme_io": false, 00:10:54.901 "nvme_io_md": false, 00:10:54.901 "write_zeroes": true, 00:10:54.901 "zcopy": true, 00:10:54.901 "get_zone_info": false, 00:10:54.901 "zone_management": false, 00:10:54.901 "zone_append": false, 00:10:54.901 "compare": false, 00:10:54.901 "compare_and_write": false, 00:10:54.901 "abort": true, 00:10:54.901 "seek_hole": false, 00:10:54.901 "seek_data": false, 00:10:54.901 "copy": true, 00:10:54.901 "nvme_iov_md": false 00:10:54.901 }, 00:10:54.901 
"memory_domains": [ 00:10:54.901 { 00:10:54.901 "dma_device_id": "system", 00:10:54.901 "dma_device_type": 1 00:10:54.901 }, 00:10:54.901 { 00:10:54.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.901 "dma_device_type": 2 00:10:54.901 } 00:10:54.901 ], 00:10:54.901 "driver_specific": {} 00:10:54.901 }' 00:10:55.158 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.158 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.158 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:55.158 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.158 09:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.158 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:55.158 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.158 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.416 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:55.416 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.416 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.416 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:55.416 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:55.416 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:55.416 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:55.674 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:55.674 "name": "BaseBdev2", 00:10:55.674 "aliases": [ 00:10:55.674 "18749b34-9aec-42ba-a4bc-f90a3b2c55c1" 00:10:55.674 ], 00:10:55.674 "product_name": "Malloc disk", 00:10:55.674 "block_size": 512, 00:10:55.674 "num_blocks": 65536, 00:10:55.674 "uuid": "18749b34-9aec-42ba-a4bc-f90a3b2c55c1", 00:10:55.674 "assigned_rate_limits": { 00:10:55.674 "rw_ios_per_sec": 0, 00:10:55.674 "rw_mbytes_per_sec": 0, 00:10:55.674 "r_mbytes_per_sec": 0, 00:10:55.674 "w_mbytes_per_sec": 0 00:10:55.674 }, 00:10:55.674 "claimed": true, 00:10:55.674 "claim_type": "exclusive_write", 00:10:55.674 "zoned": false, 00:10:55.674 "supported_io_types": { 00:10:55.674 "read": true, 00:10:55.674 "write": true, 00:10:55.674 "unmap": true, 00:10:55.674 "flush": true, 00:10:55.674 "reset": true, 00:10:55.674 "nvme_admin": false, 00:10:55.674 "nvme_io": false, 00:10:55.674 "nvme_io_md": false, 00:10:55.674 "write_zeroes": true, 00:10:55.674 "zcopy": true, 00:10:55.674 "get_zone_info": false, 00:10:55.674 "zone_management": false, 00:10:55.674 "zone_append": false, 00:10:55.674 "compare": false, 00:10:55.674 "compare_and_write": false, 00:10:55.674 "abort": true, 00:10:55.674 "seek_hole": false, 00:10:55.674 "seek_data": false, 00:10:55.674 "copy": true, 00:10:55.674 "nvme_iov_md": false 00:10:55.674 }, 00:10:55.674 "memory_domains": [ 00:10:55.674 { 00:10:55.674 "dma_device_id": "system", 00:10:55.674 "dma_device_type": 1 00:10:55.674 }, 00:10:55.674 { 00:10:55.674 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:55.674 "dma_device_type": 2 00:10:55.674 } 00:10:55.674 ], 00:10:55.674 "driver_specific": {} 00:10:55.674 }' 00:10:55.674 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.674 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.674 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:55.674 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.674 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.932 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:55.932 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.932 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.932 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:55.932 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.932 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.932 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:55.932 09:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:56.190 [2024-07-15 09:15:05.045262] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:56.190 [2024-07-15 09:15:05.045290] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:56.190 [2024-07-15 09:15:05.045330] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:56.190 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:56.191 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:56.191 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:56.191 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:56.191 09:15:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:56.191 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.191 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:56.448 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:56.448 "name": "Existed_Raid", 00:10:56.448 "uuid": "4e395e70-cdee-470b-93cd-72ae8765b7a4", 00:10:56.448 "strip_size_kb": 64, 00:10:56.448 "state": "offline", 00:10:56.448 "raid_level": "raid0", 00:10:56.448 "superblock": false, 00:10:56.448 "num_base_bdevs": 2, 00:10:56.448 "num_base_bdevs_discovered": 1, 00:10:56.448 "num_base_bdevs_operational": 1, 00:10:56.448 "base_bdevs_list": [ 00:10:56.448 { 00:10:56.448 "name": null, 00:10:56.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:56.448 "is_configured": false, 00:10:56.448 "data_offset": 0, 00:10:56.448 "data_size": 65536 00:10:56.448 }, 00:10:56.448 { 00:10:56.448 "name": "BaseBdev2", 00:10:56.448 "uuid": "18749b34-9aec-42ba-a4bc-f90a3b2c55c1", 00:10:56.448 "is_configured": true, 00:10:56.448 "data_offset": 0, 00:10:56.448 "data_size": 65536 00:10:56.448 } 00:10:56.448 ] 00:10:56.448 }' 00:10:56.448 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:56.448 09:15:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:57.014 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:57.014 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:57.014 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.014 09:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:57.272 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:57.272 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:57.272 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:57.530 [2024-07-15 09:15:06.390193] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:57.530 [2024-07-15 09:15:06.390245] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x206c000 name Existed_Raid, state offline 00:10:57.530 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:57.530 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:57.530 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.530 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:57.788 09:15:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 82813 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 82813 ']' 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 82813 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 82813 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 82813' 00:10:57.788 killing process with pid 82813 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 82813 00:10:57.788 [2024-07-15 09:15:06.710672] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:57.788 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 82813 00:10:57.788 [2024-07-15 09:15:06.711545] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:58.046 09:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:58.046 00:10:58.046 real 0m10.890s 00:10:58.046 user 0m19.479s 00:10:58.046 sys 0m1.948s 00:10:58.046 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:58.046 09:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.046 ************************************ 00:10:58.046 END TEST raid_state_function_test 00:10:58.046 ************************************ 00:10:58.046 09:15:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:58.046 09:15:06 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:58.046 09:15:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:58.046 09:15:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:58.046 09:15:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:58.046 ************************************ 00:10:58.046 START TEST raid_state_function_test_sb 00:10:58.046 ************************************ 00:10:58.046 09:15:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:10:58.046 09:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:58.046 09:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:58.046 09:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:58.046 09:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:58.304 09:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:58.304 09:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
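For reference, raid_state_function_test above drives the same rpc.py socket and repeatedly checks the raid bdev through verify_raid_bdev_state. A condensed sketch of that check, reconstructed from the xtrace above (annotation only, not part of the captured console output):

  # pull the Existed_Raid entry and assert its fields against the expected topology
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "Existed_Raid")'
  # fields asserted in this run:
  #   "state"                     : "configuring" -> "online" -> "offline"
  #   "raid_level"                : "raid0", "strip_size_kb": 64, "superblock": false
  #   "num_base_bdevs"            : 2
  #   "num_base_bdevs_discovered" : 0, then 1 (BaseBdev1 claimed), then 2 (BaseBdev2 claimed)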
00:10:58.304 09:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:58.304 09:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:58.304 09:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=84950 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 84950' 00:10:58.304 Process raid pid: 84950 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 84950 /var/tmp/spdk-raid.sock 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 84950 ']' 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:58.304 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:58.304 09:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:58.304 [2024-07-15 09:15:07.066432] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
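raid_state_function_test_sb, which starts here, walks the same states with superblock=true; in the xtrace that follows, the only visible difference is the -s flag the script adds to bdev_raid_create (superblock_create_arg=-s), so the created raid reports "superblock": true. Copied from the xtrace below, as an annotation only:

  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 \
      -b 'BaseBdev1 BaseBdev2' -n Existed_Raid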
00:10:58.304 [2024-07-15 09:15:07.066501] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:58.304 [2024-07-15 09:15:07.195444] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:58.562 [2024-07-15 09:15:07.302003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:58.562 [2024-07-15 09:15:07.371813] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:58.562 [2024-07-15 09:15:07.371852] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:59.129 09:15:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:59.129 09:15:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:59.129 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:59.449 [2024-07-15 09:15:08.234969] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:59.449 [2024-07-15 09:15:08.235014] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:59.449 [2024-07-15 09:15:08.235025] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:59.449 [2024-07-15 09:15:08.235037] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.449 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:59.707 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:59.707 "name": "Existed_Raid", 00:10:59.707 "uuid": "be587521-ed40-4e32-8812-b35f354fe95f", 00:10:59.707 "strip_size_kb": 64, 00:10:59.707 "state": "configuring", 00:10:59.707 "raid_level": 
"raid0", 00:10:59.707 "superblock": true, 00:10:59.707 "num_base_bdevs": 2, 00:10:59.707 "num_base_bdevs_discovered": 0, 00:10:59.707 "num_base_bdevs_operational": 2, 00:10:59.707 "base_bdevs_list": [ 00:10:59.707 { 00:10:59.707 "name": "BaseBdev1", 00:10:59.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.707 "is_configured": false, 00:10:59.707 "data_offset": 0, 00:10:59.707 "data_size": 0 00:10:59.707 }, 00:10:59.708 { 00:10:59.708 "name": "BaseBdev2", 00:10:59.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.708 "is_configured": false, 00:10:59.708 "data_offset": 0, 00:10:59.708 "data_size": 0 00:10:59.708 } 00:10:59.708 ] 00:10:59.708 }' 00:10:59.708 09:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:59.708 09:15:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:00.273 09:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:00.531 [2024-07-15 09:15:09.305646] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:00.531 [2024-07-15 09:15:09.305676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d71a80 name Existed_Raid, state configuring 00:11:00.531 09:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:00.789 [2024-07-15 09:15:09.550314] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:00.789 [2024-07-15 09:15:09.550345] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:00.789 [2024-07-15 09:15:09.550355] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:00.789 [2024-07-15 09:15:09.550368] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:00.789 09:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:01.047 [2024-07-15 09:15:09.804838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:01.047 BaseBdev1 00:11:01.047 09:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:01.047 09:15:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:01.047 09:15:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:01.047 09:15:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:01.047 09:15:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:01.047 09:15:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:01.047 09:15:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:01.305 09:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:01.563 [ 00:11:01.563 { 00:11:01.563 "name": "BaseBdev1", 00:11:01.563 "aliases": [ 00:11:01.563 "23b08708-9a7c-4d3b-94b6-c8acd3932f71" 00:11:01.563 ], 00:11:01.563 "product_name": "Malloc disk", 00:11:01.563 "block_size": 512, 00:11:01.563 "num_blocks": 65536, 00:11:01.563 "uuid": "23b08708-9a7c-4d3b-94b6-c8acd3932f71", 00:11:01.563 "assigned_rate_limits": { 00:11:01.563 "rw_ios_per_sec": 0, 00:11:01.563 "rw_mbytes_per_sec": 0, 00:11:01.563 "r_mbytes_per_sec": 0, 00:11:01.563 "w_mbytes_per_sec": 0 00:11:01.563 }, 00:11:01.563 "claimed": true, 00:11:01.563 "claim_type": "exclusive_write", 00:11:01.563 "zoned": false, 00:11:01.563 "supported_io_types": { 00:11:01.563 "read": true, 00:11:01.563 "write": true, 00:11:01.563 "unmap": true, 00:11:01.563 "flush": true, 00:11:01.563 "reset": true, 00:11:01.563 "nvme_admin": false, 00:11:01.563 "nvme_io": false, 00:11:01.563 "nvme_io_md": false, 00:11:01.563 "write_zeroes": true, 00:11:01.563 "zcopy": true, 00:11:01.563 "get_zone_info": false, 00:11:01.563 "zone_management": false, 00:11:01.563 "zone_append": false, 00:11:01.563 "compare": false, 00:11:01.563 "compare_and_write": false, 00:11:01.563 "abort": true, 00:11:01.563 "seek_hole": false, 00:11:01.563 "seek_data": false, 00:11:01.563 "copy": true, 00:11:01.563 "nvme_iov_md": false 00:11:01.563 }, 00:11:01.563 "memory_domains": [ 00:11:01.563 { 00:11:01.563 "dma_device_id": "system", 00:11:01.563 "dma_device_type": 1 00:11:01.563 }, 00:11:01.563 { 00:11:01.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:01.563 "dma_device_type": 2 00:11:01.563 } 00:11:01.563 ], 00:11:01.563 "driver_specific": {} 00:11:01.563 } 00:11:01.563 ] 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.563 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:01.821 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.821 "name": 
"Existed_Raid", 00:11:01.821 "uuid": "80b27ddb-6417-4989-9a55-5c4fbf4d5961", 00:11:01.821 "strip_size_kb": 64, 00:11:01.821 "state": "configuring", 00:11:01.821 "raid_level": "raid0", 00:11:01.821 "superblock": true, 00:11:01.821 "num_base_bdevs": 2, 00:11:01.821 "num_base_bdevs_discovered": 1, 00:11:01.821 "num_base_bdevs_operational": 2, 00:11:01.821 "base_bdevs_list": [ 00:11:01.821 { 00:11:01.821 "name": "BaseBdev1", 00:11:01.821 "uuid": "23b08708-9a7c-4d3b-94b6-c8acd3932f71", 00:11:01.821 "is_configured": true, 00:11:01.821 "data_offset": 2048, 00:11:01.821 "data_size": 63488 00:11:01.821 }, 00:11:01.821 { 00:11:01.821 "name": "BaseBdev2", 00:11:01.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.821 "is_configured": false, 00:11:01.821 "data_offset": 0, 00:11:01.821 "data_size": 0 00:11:01.821 } 00:11:01.821 ] 00:11:01.821 }' 00:11:01.821 09:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.821 09:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:02.388 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:02.646 [2024-07-15 09:15:11.373052] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:02.646 [2024-07-15 09:15:11.373090] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d71350 name Existed_Raid, state configuring 00:11:02.646 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:02.904 [2024-07-15 09:15:11.617735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:02.904 [2024-07-15 09:15:11.619224] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:02.904 [2024-07-15 09:15:11.619257] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:02.904 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:02.905 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:02.905 09:15:11 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:11:02.905 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.905 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:03.162 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.162 "name": "Existed_Raid", 00:11:03.162 "uuid": "217b2123-40be-4996-8a5c-95c792954f74", 00:11:03.162 "strip_size_kb": 64, 00:11:03.162 "state": "configuring", 00:11:03.162 "raid_level": "raid0", 00:11:03.162 "superblock": true, 00:11:03.162 "num_base_bdevs": 2, 00:11:03.162 "num_base_bdevs_discovered": 1, 00:11:03.162 "num_base_bdevs_operational": 2, 00:11:03.162 "base_bdevs_list": [ 00:11:03.162 { 00:11:03.162 "name": "BaseBdev1", 00:11:03.162 "uuid": "23b08708-9a7c-4d3b-94b6-c8acd3932f71", 00:11:03.162 "is_configured": true, 00:11:03.162 "data_offset": 2048, 00:11:03.162 "data_size": 63488 00:11:03.162 }, 00:11:03.162 { 00:11:03.162 "name": "BaseBdev2", 00:11:03.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.162 "is_configured": false, 00:11:03.162 "data_offset": 0, 00:11:03.162 "data_size": 0 00:11:03.162 } 00:11:03.162 ] 00:11:03.162 }' 00:11:03.162 09:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.162 09:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:04.096 09:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:04.096 [2024-07-15 09:15:12.965993] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:04.096 [2024-07-15 09:15:12.966150] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d72000 00:11:04.096 [2024-07-15 09:15:12.966163] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:04.096 [2024-07-15 09:15:12.966334] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c8c0c0 00:11:04.096 [2024-07-15 09:15:12.966450] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d72000 00:11:04.096 [2024-07-15 09:15:12.966460] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d72000 00:11:04.096 [2024-07-15 09:15:12.966552] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:04.096 BaseBdev2 00:11:04.096 09:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:04.096 09:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:04.096 09:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:04.096 09:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:04.096 09:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:04.096 09:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:04.096 09:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:04.354 09:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:04.611 [ 00:11:04.611 { 00:11:04.611 "name": "BaseBdev2", 00:11:04.611 "aliases": [ 00:11:04.611 "21b85b22-24bd-42e3-baa8-fb98891482d9" 00:11:04.611 ], 00:11:04.611 "product_name": "Malloc disk", 00:11:04.611 "block_size": 512, 00:11:04.611 "num_blocks": 65536, 00:11:04.611 "uuid": "21b85b22-24bd-42e3-baa8-fb98891482d9", 00:11:04.611 "assigned_rate_limits": { 00:11:04.611 "rw_ios_per_sec": 0, 00:11:04.611 "rw_mbytes_per_sec": 0, 00:11:04.611 "r_mbytes_per_sec": 0, 00:11:04.611 "w_mbytes_per_sec": 0 00:11:04.611 }, 00:11:04.611 "claimed": true, 00:11:04.611 "claim_type": "exclusive_write", 00:11:04.611 "zoned": false, 00:11:04.611 "supported_io_types": { 00:11:04.611 "read": true, 00:11:04.611 "write": true, 00:11:04.611 "unmap": true, 00:11:04.611 "flush": true, 00:11:04.611 "reset": true, 00:11:04.611 "nvme_admin": false, 00:11:04.611 "nvme_io": false, 00:11:04.611 "nvme_io_md": false, 00:11:04.611 "write_zeroes": true, 00:11:04.611 "zcopy": true, 00:11:04.611 "get_zone_info": false, 00:11:04.611 "zone_management": false, 00:11:04.611 "zone_append": false, 00:11:04.611 "compare": false, 00:11:04.611 "compare_and_write": false, 00:11:04.611 "abort": true, 00:11:04.611 "seek_hole": false, 00:11:04.611 "seek_data": false, 00:11:04.611 "copy": true, 00:11:04.611 "nvme_iov_md": false 00:11:04.611 }, 00:11:04.611 "memory_domains": [ 00:11:04.611 { 00:11:04.611 "dma_device_id": "system", 00:11:04.611 "dma_device_type": 1 00:11:04.611 }, 00:11:04.611 { 00:11:04.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.611 "dma_device_type": 2 00:11:04.611 } 00:11:04.611 ], 00:11:04.611 "driver_specific": {} 00:11:04.611 } 00:11:04.611 ] 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.611 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:04.868 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:04.868 "name": "Existed_Raid", 00:11:04.868 "uuid": "217b2123-40be-4996-8a5c-95c792954f74", 00:11:04.868 "strip_size_kb": 64, 00:11:04.868 "state": "online", 00:11:04.868 "raid_level": "raid0", 00:11:04.868 "superblock": true, 00:11:04.868 "num_base_bdevs": 2, 00:11:04.868 "num_base_bdevs_discovered": 2, 00:11:04.868 "num_base_bdevs_operational": 2, 00:11:04.868 "base_bdevs_list": [ 00:11:04.868 { 00:11:04.868 "name": "BaseBdev1", 00:11:04.868 "uuid": "23b08708-9a7c-4d3b-94b6-c8acd3932f71", 00:11:04.868 "is_configured": true, 00:11:04.868 "data_offset": 2048, 00:11:04.868 "data_size": 63488 00:11:04.868 }, 00:11:04.868 { 00:11:04.868 "name": "BaseBdev2", 00:11:04.868 "uuid": "21b85b22-24bd-42e3-baa8-fb98891482d9", 00:11:04.868 "is_configured": true, 00:11:04.868 "data_offset": 2048, 00:11:04.868 "data_size": 63488 00:11:04.868 } 00:11:04.868 ] 00:11:04.868 }' 00:11:04.868 09:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:04.868 09:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:05.433 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:05.433 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:05.433 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:05.433 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:05.433 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:05.433 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:05.433 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:05.433 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:05.692 [2024-07-15 09:15:14.538440] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:05.692 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:05.692 "name": "Existed_Raid", 00:11:05.692 "aliases": [ 00:11:05.692 "217b2123-40be-4996-8a5c-95c792954f74" 00:11:05.692 ], 00:11:05.692 "product_name": "Raid Volume", 00:11:05.692 "block_size": 512, 00:11:05.692 "num_blocks": 126976, 00:11:05.692 "uuid": "217b2123-40be-4996-8a5c-95c792954f74", 00:11:05.692 "assigned_rate_limits": { 00:11:05.692 "rw_ios_per_sec": 0, 00:11:05.692 "rw_mbytes_per_sec": 0, 00:11:05.692 "r_mbytes_per_sec": 0, 00:11:05.692 "w_mbytes_per_sec": 0 00:11:05.692 }, 00:11:05.692 "claimed": false, 00:11:05.692 "zoned": false, 00:11:05.692 "supported_io_types": { 00:11:05.692 "read": true, 00:11:05.692 "write": true, 00:11:05.692 "unmap": true, 00:11:05.692 "flush": true, 00:11:05.692 "reset": true, 00:11:05.692 "nvme_admin": false, 00:11:05.692 "nvme_io": false, 00:11:05.692 "nvme_io_md": false, 00:11:05.692 "write_zeroes": true, 
00:11:05.692 "zcopy": false, 00:11:05.692 "get_zone_info": false, 00:11:05.692 "zone_management": false, 00:11:05.692 "zone_append": false, 00:11:05.692 "compare": false, 00:11:05.692 "compare_and_write": false, 00:11:05.692 "abort": false, 00:11:05.692 "seek_hole": false, 00:11:05.692 "seek_data": false, 00:11:05.692 "copy": false, 00:11:05.692 "nvme_iov_md": false 00:11:05.692 }, 00:11:05.692 "memory_domains": [ 00:11:05.692 { 00:11:05.692 "dma_device_id": "system", 00:11:05.692 "dma_device_type": 1 00:11:05.692 }, 00:11:05.692 { 00:11:05.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.692 "dma_device_type": 2 00:11:05.692 }, 00:11:05.692 { 00:11:05.692 "dma_device_id": "system", 00:11:05.692 "dma_device_type": 1 00:11:05.692 }, 00:11:05.692 { 00:11:05.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.692 "dma_device_type": 2 00:11:05.692 } 00:11:05.692 ], 00:11:05.692 "driver_specific": { 00:11:05.692 "raid": { 00:11:05.692 "uuid": "217b2123-40be-4996-8a5c-95c792954f74", 00:11:05.692 "strip_size_kb": 64, 00:11:05.692 "state": "online", 00:11:05.692 "raid_level": "raid0", 00:11:05.692 "superblock": true, 00:11:05.692 "num_base_bdevs": 2, 00:11:05.692 "num_base_bdevs_discovered": 2, 00:11:05.692 "num_base_bdevs_operational": 2, 00:11:05.692 "base_bdevs_list": [ 00:11:05.692 { 00:11:05.692 "name": "BaseBdev1", 00:11:05.692 "uuid": "23b08708-9a7c-4d3b-94b6-c8acd3932f71", 00:11:05.692 "is_configured": true, 00:11:05.692 "data_offset": 2048, 00:11:05.692 "data_size": 63488 00:11:05.692 }, 00:11:05.692 { 00:11:05.692 "name": "BaseBdev2", 00:11:05.692 "uuid": "21b85b22-24bd-42e3-baa8-fb98891482d9", 00:11:05.692 "is_configured": true, 00:11:05.692 "data_offset": 2048, 00:11:05.692 "data_size": 63488 00:11:05.692 } 00:11:05.692 ] 00:11:05.692 } 00:11:05.692 } 00:11:05.692 }' 00:11:05.692 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:05.692 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:05.692 BaseBdev2' 00:11:05.692 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:05.692 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:05.692 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:06.058 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:06.058 "name": "BaseBdev1", 00:11:06.058 "aliases": [ 00:11:06.058 "23b08708-9a7c-4d3b-94b6-c8acd3932f71" 00:11:06.058 ], 00:11:06.058 "product_name": "Malloc disk", 00:11:06.058 "block_size": 512, 00:11:06.058 "num_blocks": 65536, 00:11:06.058 "uuid": "23b08708-9a7c-4d3b-94b6-c8acd3932f71", 00:11:06.058 "assigned_rate_limits": { 00:11:06.058 "rw_ios_per_sec": 0, 00:11:06.058 "rw_mbytes_per_sec": 0, 00:11:06.058 "r_mbytes_per_sec": 0, 00:11:06.058 "w_mbytes_per_sec": 0 00:11:06.058 }, 00:11:06.058 "claimed": true, 00:11:06.058 "claim_type": "exclusive_write", 00:11:06.058 "zoned": false, 00:11:06.058 "supported_io_types": { 00:11:06.058 "read": true, 00:11:06.058 "write": true, 00:11:06.058 "unmap": true, 00:11:06.058 "flush": true, 00:11:06.058 "reset": true, 00:11:06.058 "nvme_admin": false, 00:11:06.058 "nvme_io": false, 00:11:06.058 "nvme_io_md": false, 00:11:06.058 
"write_zeroes": true, 00:11:06.058 "zcopy": true, 00:11:06.058 "get_zone_info": false, 00:11:06.058 "zone_management": false, 00:11:06.058 "zone_append": false, 00:11:06.058 "compare": false, 00:11:06.058 "compare_and_write": false, 00:11:06.058 "abort": true, 00:11:06.058 "seek_hole": false, 00:11:06.058 "seek_data": false, 00:11:06.058 "copy": true, 00:11:06.058 "nvme_iov_md": false 00:11:06.058 }, 00:11:06.058 "memory_domains": [ 00:11:06.058 { 00:11:06.058 "dma_device_id": "system", 00:11:06.058 "dma_device_type": 1 00:11:06.058 }, 00:11:06.058 { 00:11:06.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.058 "dma_device_type": 2 00:11:06.058 } 00:11:06.058 ], 00:11:06.058 "driver_specific": {} 00:11:06.058 }' 00:11:06.058 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.058 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.058 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:06.058 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.058 09:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:06.316 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:06.574 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:06.574 "name": "BaseBdev2", 00:11:06.574 "aliases": [ 00:11:06.574 "21b85b22-24bd-42e3-baa8-fb98891482d9" 00:11:06.574 ], 00:11:06.574 "product_name": "Malloc disk", 00:11:06.574 "block_size": 512, 00:11:06.574 "num_blocks": 65536, 00:11:06.574 "uuid": "21b85b22-24bd-42e3-baa8-fb98891482d9", 00:11:06.574 "assigned_rate_limits": { 00:11:06.574 "rw_ios_per_sec": 0, 00:11:06.574 "rw_mbytes_per_sec": 0, 00:11:06.574 "r_mbytes_per_sec": 0, 00:11:06.574 "w_mbytes_per_sec": 0 00:11:06.574 }, 00:11:06.574 "claimed": true, 00:11:06.574 "claim_type": "exclusive_write", 00:11:06.574 "zoned": false, 00:11:06.574 "supported_io_types": { 00:11:06.574 "read": true, 00:11:06.574 "write": true, 00:11:06.574 "unmap": true, 00:11:06.574 "flush": true, 00:11:06.574 "reset": true, 00:11:06.574 "nvme_admin": false, 00:11:06.574 "nvme_io": false, 00:11:06.575 "nvme_io_md": false, 00:11:06.575 "write_zeroes": true, 00:11:06.575 "zcopy": true, 00:11:06.575 "get_zone_info": false, 00:11:06.575 "zone_management": false, 
00:11:06.575 "zone_append": false, 00:11:06.575 "compare": false, 00:11:06.575 "compare_and_write": false, 00:11:06.575 "abort": true, 00:11:06.575 "seek_hole": false, 00:11:06.575 "seek_data": false, 00:11:06.575 "copy": true, 00:11:06.575 "nvme_iov_md": false 00:11:06.575 }, 00:11:06.575 "memory_domains": [ 00:11:06.575 { 00:11:06.575 "dma_device_id": "system", 00:11:06.575 "dma_device_type": 1 00:11:06.575 }, 00:11:06.575 { 00:11:06.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.575 "dma_device_type": 2 00:11:06.575 } 00:11:06.575 ], 00:11:06.575 "driver_specific": {} 00:11:06.575 }' 00:11:06.575 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.575 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.832 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:06.832 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.832 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.832 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:06.832 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.832 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.832 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.832 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.832 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:07.090 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:07.090 09:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:07.090 [2024-07-15 09:15:16.030184] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:07.090 [2024-07-15 09:15:16.030213] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:07.090 [2024-07-15 09:15:16.030256] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:07.347 "name": "Existed_Raid", 00:11:07.347 "uuid": "217b2123-40be-4996-8a5c-95c792954f74", 00:11:07.347 "strip_size_kb": 64, 00:11:07.347 "state": "offline", 00:11:07.347 "raid_level": "raid0", 00:11:07.347 "superblock": true, 00:11:07.347 "num_base_bdevs": 2, 00:11:07.347 "num_base_bdevs_discovered": 1, 00:11:07.347 "num_base_bdevs_operational": 1, 00:11:07.347 "base_bdevs_list": [ 00:11:07.347 { 00:11:07.347 "name": null, 00:11:07.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:07.347 "is_configured": false, 00:11:07.347 "data_offset": 2048, 00:11:07.347 "data_size": 63488 00:11:07.347 }, 00:11:07.347 { 00:11:07.347 "name": "BaseBdev2", 00:11:07.347 "uuid": "21b85b22-24bd-42e3-baa8-fb98891482d9", 00:11:07.347 "is_configured": true, 00:11:07.347 "data_offset": 2048, 00:11:07.347 "data_size": 63488 00:11:07.347 } 00:11:07.347 ] 00:11:07.347 }' 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:07.347 09:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:08.280 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:08.280 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:08.280 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.280 09:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:08.280 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:08.280 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:08.280 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:08.844 [2024-07-15 09:15:17.632357] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:08.844 [2024-07-15 09:15:17.632411] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d72000 name Existed_Raid, state offline 00:11:08.844 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:08.844 09:15:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:08.844 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.844 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:09.103 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:09.103 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:09.103 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:09.103 09:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 84950 00:11:09.103 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 84950 ']' 00:11:09.103 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 84950 00:11:09.103 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:09.103 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:09.103 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 84950 00:11:09.104 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:09.104 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:09.104 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 84950' 00:11:09.104 killing process with pid 84950 00:11:09.104 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 84950 00:11:09.104 [2024-07-15 09:15:17.973672] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:09.104 09:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 84950 00:11:09.104 [2024-07-15 09:15:17.974621] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:09.361 09:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:09.361 00:11:09.361 real 0m11.182s 00:11:09.361 user 0m19.931s 00:11:09.361 sys 0m2.067s 00:11:09.361 09:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:09.361 09:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:09.361 ************************************ 00:11:09.362 END TEST raid_state_function_test_sb 00:11:09.362 ************************************ 00:11:09.362 09:15:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:09.362 09:15:18 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:09.362 09:15:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:09.362 09:15:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:09.362 09:15:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:09.362 ************************************ 00:11:09.362 START TEST raid_superblock_test 00:11:09.362 ************************************ 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:11:09.362 09:15:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=86749 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 86749 /var/tmp/spdk-raid.sock 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 86749 ']' 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:09.362 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:09.362 09:15:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:09.620 [2024-07-15 09:15:18.325391] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:11:09.620 [2024-07-15 09:15:18.325452] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86749 ] 00:11:09.620 [2024-07-15 09:15:18.451676] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:09.620 [2024-07-15 09:15:18.558717] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:09.878 [2024-07-15 09:15:18.625294] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:09.878 [2024-07-15 09:15:18.625324] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:09.878 09:15:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:10.135 malloc1 00:11:10.135 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:10.699 [2024-07-15 09:15:19.515209] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:10.699 [2024-07-15 09:15:19.515261] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:10.699 [2024-07-15 09:15:19.515282] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc17570 00:11:10.699 [2024-07-15 09:15:19.515294] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:10.699 [2024-07-15 09:15:19.517046] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:10.699 [2024-07-15 09:15:19.517075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:10.699 pt1 00:11:10.699 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:10.699 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:10.699 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:10.699 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:10.699 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:10.699 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:10.699 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:10.699 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:10.699 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:10.956 malloc2 00:11:10.956 09:15:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:11.520 [2024-07-15 09:15:20.294070] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:11.520 [2024-07-15 09:15:20.294125] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:11.520 [2024-07-15 09:15:20.294144] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc18970 00:11:11.520 [2024-07-15 09:15:20.294157] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:11.520 [2024-07-15 09:15:20.295869] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:11.520 [2024-07-15 09:15:20.295897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:11.520 pt2 00:11:11.520 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:11.520 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:11.520 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:11.778 [2024-07-15 09:15:20.550765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:11.778 [2024-07-15 09:15:20.552165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:11.778 [2024-07-15 09:15:20.552312] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdbb270 00:11:11.778 [2024-07-15 09:15:20.552325] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:11.778 [2024-07-15 09:15:20.552528] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdb0c10 00:11:11.778 [2024-07-15 09:15:20.552676] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdbb270 00:11:11.778 [2024-07-15 09:15:20.552686] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdbb270 00:11:11.778 [2024-07-15 09:15:20.552792] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:11.778 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:11.778 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:11.778 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:11.778 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:11.779 09:15:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:11.779 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:11.779 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:11.779 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:11.779 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:11.779 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:11.779 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.779 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:12.036 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:12.036 "name": "raid_bdev1", 00:11:12.036 "uuid": "626c4a12-ec92-46bb-b59a-ca50d658e416", 00:11:12.036 "strip_size_kb": 64, 00:11:12.036 "state": "online", 00:11:12.036 "raid_level": "raid0", 00:11:12.036 "superblock": true, 00:11:12.036 "num_base_bdevs": 2, 00:11:12.036 "num_base_bdevs_discovered": 2, 00:11:12.036 "num_base_bdevs_operational": 2, 00:11:12.036 "base_bdevs_list": [ 00:11:12.036 { 00:11:12.036 "name": "pt1", 00:11:12.036 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:12.036 "is_configured": true, 00:11:12.037 "data_offset": 2048, 00:11:12.037 "data_size": 63488 00:11:12.037 }, 00:11:12.037 { 00:11:12.037 "name": "pt2", 00:11:12.037 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:12.037 "is_configured": true, 00:11:12.037 "data_offset": 2048, 00:11:12.037 "data_size": 63488 00:11:12.037 } 00:11:12.037 ] 00:11:12.037 }' 00:11:12.037 09:15:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:12.037 09:15:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.602 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:12.602 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:12.602 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:12.602 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:12.602 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:12.602 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:12.602 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:12.602 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:12.859 [2024-07-15 09:15:21.617812] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:12.859 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:12.859 "name": "raid_bdev1", 00:11:12.859 "aliases": [ 00:11:12.859 "626c4a12-ec92-46bb-b59a-ca50d658e416" 00:11:12.859 ], 00:11:12.859 "product_name": "Raid Volume", 00:11:12.859 "block_size": 512, 00:11:12.859 "num_blocks": 126976, 00:11:12.859 "uuid": 
"626c4a12-ec92-46bb-b59a-ca50d658e416", 00:11:12.859 "assigned_rate_limits": { 00:11:12.859 "rw_ios_per_sec": 0, 00:11:12.859 "rw_mbytes_per_sec": 0, 00:11:12.859 "r_mbytes_per_sec": 0, 00:11:12.859 "w_mbytes_per_sec": 0 00:11:12.859 }, 00:11:12.859 "claimed": false, 00:11:12.859 "zoned": false, 00:11:12.859 "supported_io_types": { 00:11:12.859 "read": true, 00:11:12.859 "write": true, 00:11:12.859 "unmap": true, 00:11:12.859 "flush": true, 00:11:12.859 "reset": true, 00:11:12.859 "nvme_admin": false, 00:11:12.859 "nvme_io": false, 00:11:12.859 "nvme_io_md": false, 00:11:12.859 "write_zeroes": true, 00:11:12.859 "zcopy": false, 00:11:12.859 "get_zone_info": false, 00:11:12.859 "zone_management": false, 00:11:12.859 "zone_append": false, 00:11:12.859 "compare": false, 00:11:12.859 "compare_and_write": false, 00:11:12.859 "abort": false, 00:11:12.859 "seek_hole": false, 00:11:12.859 "seek_data": false, 00:11:12.859 "copy": false, 00:11:12.859 "nvme_iov_md": false 00:11:12.859 }, 00:11:12.859 "memory_domains": [ 00:11:12.859 { 00:11:12.859 "dma_device_id": "system", 00:11:12.859 "dma_device_type": 1 00:11:12.859 }, 00:11:12.859 { 00:11:12.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.859 "dma_device_type": 2 00:11:12.859 }, 00:11:12.859 { 00:11:12.859 "dma_device_id": "system", 00:11:12.859 "dma_device_type": 1 00:11:12.859 }, 00:11:12.859 { 00:11:12.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.859 "dma_device_type": 2 00:11:12.859 } 00:11:12.859 ], 00:11:12.859 "driver_specific": { 00:11:12.859 "raid": { 00:11:12.859 "uuid": "626c4a12-ec92-46bb-b59a-ca50d658e416", 00:11:12.859 "strip_size_kb": 64, 00:11:12.859 "state": "online", 00:11:12.859 "raid_level": "raid0", 00:11:12.859 "superblock": true, 00:11:12.859 "num_base_bdevs": 2, 00:11:12.859 "num_base_bdevs_discovered": 2, 00:11:12.859 "num_base_bdevs_operational": 2, 00:11:12.859 "base_bdevs_list": [ 00:11:12.859 { 00:11:12.859 "name": "pt1", 00:11:12.859 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:12.859 "is_configured": true, 00:11:12.859 "data_offset": 2048, 00:11:12.859 "data_size": 63488 00:11:12.859 }, 00:11:12.859 { 00:11:12.859 "name": "pt2", 00:11:12.859 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:12.859 "is_configured": true, 00:11:12.859 "data_offset": 2048, 00:11:12.859 "data_size": 63488 00:11:12.859 } 00:11:12.859 ] 00:11:12.859 } 00:11:12.859 } 00:11:12.859 }' 00:11:12.859 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:12.859 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:12.859 pt2' 00:11:12.859 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:12.859 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:12.860 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:13.117 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:13.117 "name": "pt1", 00:11:13.117 "aliases": [ 00:11:13.117 "00000000-0000-0000-0000-000000000001" 00:11:13.117 ], 00:11:13.117 "product_name": "passthru", 00:11:13.117 "block_size": 512, 00:11:13.117 "num_blocks": 65536, 00:11:13.117 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:13.117 "assigned_rate_limits": { 00:11:13.117 
"rw_ios_per_sec": 0, 00:11:13.117 "rw_mbytes_per_sec": 0, 00:11:13.117 "r_mbytes_per_sec": 0, 00:11:13.117 "w_mbytes_per_sec": 0 00:11:13.117 }, 00:11:13.117 "claimed": true, 00:11:13.117 "claim_type": "exclusive_write", 00:11:13.117 "zoned": false, 00:11:13.117 "supported_io_types": { 00:11:13.117 "read": true, 00:11:13.117 "write": true, 00:11:13.117 "unmap": true, 00:11:13.117 "flush": true, 00:11:13.117 "reset": true, 00:11:13.117 "nvme_admin": false, 00:11:13.117 "nvme_io": false, 00:11:13.117 "nvme_io_md": false, 00:11:13.117 "write_zeroes": true, 00:11:13.117 "zcopy": true, 00:11:13.117 "get_zone_info": false, 00:11:13.117 "zone_management": false, 00:11:13.117 "zone_append": false, 00:11:13.117 "compare": false, 00:11:13.117 "compare_and_write": false, 00:11:13.117 "abort": true, 00:11:13.117 "seek_hole": false, 00:11:13.117 "seek_data": false, 00:11:13.117 "copy": true, 00:11:13.117 "nvme_iov_md": false 00:11:13.117 }, 00:11:13.117 "memory_domains": [ 00:11:13.117 { 00:11:13.117 "dma_device_id": "system", 00:11:13.117 "dma_device_type": 1 00:11:13.117 }, 00:11:13.117 { 00:11:13.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.117 "dma_device_type": 2 00:11:13.117 } 00:11:13.117 ], 00:11:13.117 "driver_specific": { 00:11:13.117 "passthru": { 00:11:13.117 "name": "pt1", 00:11:13.117 "base_bdev_name": "malloc1" 00:11:13.117 } 00:11:13.117 } 00:11:13.117 }' 00:11:13.117 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:13.117 09:15:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:13.117 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:13.117 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:13.117 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:13.375 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:13.633 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:13.633 "name": "pt2", 00:11:13.633 "aliases": [ 00:11:13.633 "00000000-0000-0000-0000-000000000002" 00:11:13.633 ], 00:11:13.633 "product_name": "passthru", 00:11:13.633 "block_size": 512, 00:11:13.633 "num_blocks": 65536, 00:11:13.633 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:13.633 "assigned_rate_limits": { 00:11:13.633 "rw_ios_per_sec": 0, 00:11:13.633 "rw_mbytes_per_sec": 0, 00:11:13.633 "r_mbytes_per_sec": 0, 00:11:13.633 "w_mbytes_per_sec": 0 
00:11:13.633 }, 00:11:13.633 "claimed": true, 00:11:13.633 "claim_type": "exclusive_write", 00:11:13.633 "zoned": false, 00:11:13.633 "supported_io_types": { 00:11:13.633 "read": true, 00:11:13.633 "write": true, 00:11:13.633 "unmap": true, 00:11:13.633 "flush": true, 00:11:13.633 "reset": true, 00:11:13.633 "nvme_admin": false, 00:11:13.633 "nvme_io": false, 00:11:13.633 "nvme_io_md": false, 00:11:13.633 "write_zeroes": true, 00:11:13.633 "zcopy": true, 00:11:13.633 "get_zone_info": false, 00:11:13.633 "zone_management": false, 00:11:13.633 "zone_append": false, 00:11:13.633 "compare": false, 00:11:13.633 "compare_and_write": false, 00:11:13.633 "abort": true, 00:11:13.633 "seek_hole": false, 00:11:13.633 "seek_data": false, 00:11:13.633 "copy": true, 00:11:13.633 "nvme_iov_md": false 00:11:13.633 }, 00:11:13.633 "memory_domains": [ 00:11:13.633 { 00:11:13.633 "dma_device_id": "system", 00:11:13.633 "dma_device_type": 1 00:11:13.633 }, 00:11:13.633 { 00:11:13.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.633 "dma_device_type": 2 00:11:13.633 } 00:11:13.633 ], 00:11:13.633 "driver_specific": { 00:11:13.633 "passthru": { 00:11:13.633 "name": "pt2", 00:11:13.633 "base_bdev_name": "malloc2" 00:11:13.633 } 00:11:13.633 } 00:11:13.633 }' 00:11:13.633 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:13.633 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:13.891 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:13.891 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:13.891 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:13.891 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:13.891 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:13.891 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:13.891 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:13.891 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:13.891 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.147 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:14.147 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:14.147 09:15:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:14.147 [2024-07-15 09:15:23.093692] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:14.404 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=626c4a12-ec92-46bb-b59a-ca50d658e416 00:11:14.404 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 626c4a12-ec92-46bb-b59a-ca50d658e416 ']' 00:11:14.404 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:14.404 [2024-07-15 09:15:23.338246] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:14.405 [2024-07-15 09:15:23.338274] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:11:14.405 [2024-07-15 09:15:23.338331] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:14.405 [2024-07-15 09:15:23.338376] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:14.405 [2024-07-15 09:15:23.338389] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdbb270 name raid_bdev1, state offline 00:11:14.662 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.662 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:14.662 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:14.662 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:14.662 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:14.662 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:14.920 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:14.920 09:15:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:15.485 09:15:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:15.485 09:15:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:15.743 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:16.001 [2024-07-15 09:15:24.838136] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:16.001 [2024-07-15 09:15:24.839548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:16.001 [2024-07-15 09:15:24.839604] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:16.001 [2024-07-15 09:15:24.839645] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:16.001 [2024-07-15 09:15:24.839664] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:16.001 [2024-07-15 09:15:24.839674] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdbaff0 name raid_bdev1, state configuring 00:11:16.001 request: 00:11:16.001 { 00:11:16.001 "name": "raid_bdev1", 00:11:16.001 "raid_level": "raid0", 00:11:16.001 "base_bdevs": [ 00:11:16.001 "malloc1", 00:11:16.001 "malloc2" 00:11:16.001 ], 00:11:16.001 "strip_size_kb": 64, 00:11:16.001 "superblock": false, 00:11:16.001 "method": "bdev_raid_create", 00:11:16.001 "req_id": 1 00:11:16.001 } 00:11:16.001 Got JSON-RPC error response 00:11:16.001 response: 00:11:16.001 { 00:11:16.001 "code": -17, 00:11:16.001 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:16.001 } 00:11:16.001 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:16.001 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:16.001 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:16.001 09:15:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:16.001 09:15:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.001 09:15:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:16.259 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:16.259 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:16.259 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:16.518 [2024-07-15 09:15:25.327377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:16.518 [2024-07-15 09:15:25.327425] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:16.518 [2024-07-15 09:15:25.327447] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc177a0 00:11:16.518 [2024-07-15 09:15:25.327459] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:16.518 [2024-07-15 09:15:25.329101] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:16.518 [2024-07-15 09:15:25.329129] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:16.518 [2024-07-15 09:15:25.329199] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:16.518 [2024-07-15 09:15:25.329224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:16.518 pt1 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.518 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:16.776 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:16.776 "name": "raid_bdev1", 00:11:16.776 "uuid": "626c4a12-ec92-46bb-b59a-ca50d658e416", 00:11:16.776 "strip_size_kb": 64, 00:11:16.776 "state": "configuring", 00:11:16.776 "raid_level": "raid0", 00:11:16.776 "superblock": true, 00:11:16.776 "num_base_bdevs": 2, 00:11:16.776 "num_base_bdevs_discovered": 1, 00:11:16.776 "num_base_bdevs_operational": 2, 00:11:16.776 "base_bdevs_list": [ 00:11:16.776 { 00:11:16.776 "name": "pt1", 00:11:16.776 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:16.776 "is_configured": true, 00:11:16.776 "data_offset": 2048, 00:11:16.776 "data_size": 63488 00:11:16.776 }, 00:11:16.776 { 00:11:16.776 "name": null, 00:11:16.776 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:16.776 "is_configured": false, 00:11:16.776 "data_offset": 2048, 00:11:16.776 "data_size": 63488 00:11:16.776 } 00:11:16.776 ] 00:11:16.776 }' 00:11:16.776 09:15:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:16.776 09:15:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.342 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:17.342 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:17.342 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:17.342 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:17.600 [2024-07-15 09:15:26.414270] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:17.600 [2024-07-15 09:15:26.414318] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:17.600 [2024-07-15 09:15:26.414336] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdb1820 00:11:17.600 [2024-07-15 09:15:26.414349] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:17.600 [2024-07-15 09:15:26.414688] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:17.600 [2024-07-15 09:15:26.414706] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:17.600 [2024-07-15 09:15:26.414768] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:17.600 [2024-07-15 09:15:26.414786] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:17.600 [2024-07-15 09:15:26.414881] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc0dec0 00:11:17.600 [2024-07-15 09:15:26.414891] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:17.600 [2024-07-15 09:15:26.415086] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc10530 00:11:17.600 [2024-07-15 09:15:26.415211] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc0dec0 00:11:17.600 [2024-07-15 09:15:26.415221] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc0dec0 00:11:17.600 [2024-07-15 09:15:26.415322] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:17.600 pt2 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.600 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:17.858 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:17.858 
"name": "raid_bdev1", 00:11:17.858 "uuid": "626c4a12-ec92-46bb-b59a-ca50d658e416", 00:11:17.858 "strip_size_kb": 64, 00:11:17.858 "state": "online", 00:11:17.858 "raid_level": "raid0", 00:11:17.858 "superblock": true, 00:11:17.858 "num_base_bdevs": 2, 00:11:17.858 "num_base_bdevs_discovered": 2, 00:11:17.858 "num_base_bdevs_operational": 2, 00:11:17.858 "base_bdevs_list": [ 00:11:17.858 { 00:11:17.858 "name": "pt1", 00:11:17.858 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:17.858 "is_configured": true, 00:11:17.858 "data_offset": 2048, 00:11:17.858 "data_size": 63488 00:11:17.858 }, 00:11:17.858 { 00:11:17.858 "name": "pt2", 00:11:17.858 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:17.858 "is_configured": true, 00:11:17.858 "data_offset": 2048, 00:11:17.858 "data_size": 63488 00:11:17.858 } 00:11:17.858 ] 00:11:17.858 }' 00:11:17.858 09:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.858 09:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:18.422 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:18.422 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:18.422 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:18.422 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:18.422 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:18.422 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:18.422 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:18.422 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:18.680 [2024-07-15 09:15:27.493377] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:18.680 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:18.680 "name": "raid_bdev1", 00:11:18.680 "aliases": [ 00:11:18.680 "626c4a12-ec92-46bb-b59a-ca50d658e416" 00:11:18.680 ], 00:11:18.680 "product_name": "Raid Volume", 00:11:18.680 "block_size": 512, 00:11:18.680 "num_blocks": 126976, 00:11:18.680 "uuid": "626c4a12-ec92-46bb-b59a-ca50d658e416", 00:11:18.680 "assigned_rate_limits": { 00:11:18.680 "rw_ios_per_sec": 0, 00:11:18.680 "rw_mbytes_per_sec": 0, 00:11:18.680 "r_mbytes_per_sec": 0, 00:11:18.680 "w_mbytes_per_sec": 0 00:11:18.680 }, 00:11:18.680 "claimed": false, 00:11:18.680 "zoned": false, 00:11:18.680 "supported_io_types": { 00:11:18.680 "read": true, 00:11:18.680 "write": true, 00:11:18.680 "unmap": true, 00:11:18.680 "flush": true, 00:11:18.680 "reset": true, 00:11:18.680 "nvme_admin": false, 00:11:18.680 "nvme_io": false, 00:11:18.680 "nvme_io_md": false, 00:11:18.680 "write_zeroes": true, 00:11:18.680 "zcopy": false, 00:11:18.680 "get_zone_info": false, 00:11:18.680 "zone_management": false, 00:11:18.680 "zone_append": false, 00:11:18.680 "compare": false, 00:11:18.680 "compare_and_write": false, 00:11:18.680 "abort": false, 00:11:18.680 "seek_hole": false, 00:11:18.680 "seek_data": false, 00:11:18.680 "copy": false, 00:11:18.680 "nvme_iov_md": false 00:11:18.680 }, 00:11:18.680 "memory_domains": [ 00:11:18.680 { 00:11:18.680 "dma_device_id": "system", 
00:11:18.680 "dma_device_type": 1 00:11:18.680 }, 00:11:18.680 { 00:11:18.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.680 "dma_device_type": 2 00:11:18.680 }, 00:11:18.680 { 00:11:18.680 "dma_device_id": "system", 00:11:18.680 "dma_device_type": 1 00:11:18.680 }, 00:11:18.680 { 00:11:18.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.680 "dma_device_type": 2 00:11:18.680 } 00:11:18.680 ], 00:11:18.680 "driver_specific": { 00:11:18.680 "raid": { 00:11:18.680 "uuid": "626c4a12-ec92-46bb-b59a-ca50d658e416", 00:11:18.680 "strip_size_kb": 64, 00:11:18.680 "state": "online", 00:11:18.680 "raid_level": "raid0", 00:11:18.680 "superblock": true, 00:11:18.680 "num_base_bdevs": 2, 00:11:18.680 "num_base_bdevs_discovered": 2, 00:11:18.680 "num_base_bdevs_operational": 2, 00:11:18.680 "base_bdevs_list": [ 00:11:18.680 { 00:11:18.680 "name": "pt1", 00:11:18.680 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:18.680 "is_configured": true, 00:11:18.680 "data_offset": 2048, 00:11:18.680 "data_size": 63488 00:11:18.680 }, 00:11:18.680 { 00:11:18.680 "name": "pt2", 00:11:18.680 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:18.680 "is_configured": true, 00:11:18.680 "data_offset": 2048, 00:11:18.680 "data_size": 63488 00:11:18.680 } 00:11:18.680 ] 00:11:18.680 } 00:11:18.680 } 00:11:18.680 }' 00:11:18.680 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:18.680 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:18.680 pt2' 00:11:18.680 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:18.680 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:18.680 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:18.938 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:18.938 "name": "pt1", 00:11:18.938 "aliases": [ 00:11:18.938 "00000000-0000-0000-0000-000000000001" 00:11:18.938 ], 00:11:18.938 "product_name": "passthru", 00:11:18.938 "block_size": 512, 00:11:18.938 "num_blocks": 65536, 00:11:18.938 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:18.938 "assigned_rate_limits": { 00:11:18.938 "rw_ios_per_sec": 0, 00:11:18.938 "rw_mbytes_per_sec": 0, 00:11:18.938 "r_mbytes_per_sec": 0, 00:11:18.938 "w_mbytes_per_sec": 0 00:11:18.938 }, 00:11:18.938 "claimed": true, 00:11:18.938 "claim_type": "exclusive_write", 00:11:18.938 "zoned": false, 00:11:18.938 "supported_io_types": { 00:11:18.938 "read": true, 00:11:18.938 "write": true, 00:11:18.938 "unmap": true, 00:11:18.938 "flush": true, 00:11:18.938 "reset": true, 00:11:18.938 "nvme_admin": false, 00:11:18.938 "nvme_io": false, 00:11:18.938 "nvme_io_md": false, 00:11:18.938 "write_zeroes": true, 00:11:18.938 "zcopy": true, 00:11:18.938 "get_zone_info": false, 00:11:18.938 "zone_management": false, 00:11:18.938 "zone_append": false, 00:11:18.938 "compare": false, 00:11:18.938 "compare_and_write": false, 00:11:18.938 "abort": true, 00:11:18.938 "seek_hole": false, 00:11:18.938 "seek_data": false, 00:11:18.938 "copy": true, 00:11:18.938 "nvme_iov_md": false 00:11:18.938 }, 00:11:18.938 "memory_domains": [ 00:11:18.938 { 00:11:18.938 "dma_device_id": "system", 00:11:18.938 "dma_device_type": 1 00:11:18.938 }, 00:11:18.938 { 
00:11:18.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.938 "dma_device_type": 2 00:11:18.938 } 00:11:18.938 ], 00:11:18.938 "driver_specific": { 00:11:18.938 "passthru": { 00:11:18.938 "name": "pt1", 00:11:18.938 "base_bdev_name": "malloc1" 00:11:18.938 } 00:11:18.938 } 00:11:18.938 }' 00:11:18.938 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.938 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:19.196 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:19.196 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:19.196 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:19.196 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:19.196 09:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:19.196 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:19.196 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:19.196 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.196 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.455 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:19.455 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:19.455 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:19.455 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:19.714 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:19.714 "name": "pt2", 00:11:19.714 "aliases": [ 00:11:19.714 "00000000-0000-0000-0000-000000000002" 00:11:19.714 ], 00:11:19.714 "product_name": "passthru", 00:11:19.714 "block_size": 512, 00:11:19.714 "num_blocks": 65536, 00:11:19.714 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:19.714 "assigned_rate_limits": { 00:11:19.714 "rw_ios_per_sec": 0, 00:11:19.714 "rw_mbytes_per_sec": 0, 00:11:19.714 "r_mbytes_per_sec": 0, 00:11:19.714 "w_mbytes_per_sec": 0 00:11:19.714 }, 00:11:19.714 "claimed": true, 00:11:19.714 "claim_type": "exclusive_write", 00:11:19.714 "zoned": false, 00:11:19.714 "supported_io_types": { 00:11:19.714 "read": true, 00:11:19.714 "write": true, 00:11:19.714 "unmap": true, 00:11:19.714 "flush": true, 00:11:19.714 "reset": true, 00:11:19.714 "nvme_admin": false, 00:11:19.714 "nvme_io": false, 00:11:19.714 "nvme_io_md": false, 00:11:19.714 "write_zeroes": true, 00:11:19.714 "zcopy": true, 00:11:19.714 "get_zone_info": false, 00:11:19.714 "zone_management": false, 00:11:19.714 "zone_append": false, 00:11:19.714 "compare": false, 00:11:19.714 "compare_and_write": false, 00:11:19.714 "abort": true, 00:11:19.714 "seek_hole": false, 00:11:19.714 "seek_data": false, 00:11:19.714 "copy": true, 00:11:19.714 "nvme_iov_md": false 00:11:19.714 }, 00:11:19.714 "memory_domains": [ 00:11:19.714 { 00:11:19.714 "dma_device_id": "system", 00:11:19.714 "dma_device_type": 1 00:11:19.714 }, 00:11:19.714 { 00:11:19.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:19.714 "dma_device_type": 2 00:11:19.714 } 00:11:19.714 ], 00:11:19.714 
"driver_specific": { 00:11:19.714 "passthru": { 00:11:19.714 "name": "pt2", 00:11:19.714 "base_bdev_name": "malloc2" 00:11:19.714 } 00:11:19.714 } 00:11:19.714 }' 00:11:19.714 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:19.714 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:19.714 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:19.714 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:19.714 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:19.714 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:19.714 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:19.714 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:19.972 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:19.972 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.972 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:19.972 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:19.972 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:19.972 09:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:20.230 [2024-07-15 09:15:28.993342] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 626c4a12-ec92-46bb-b59a-ca50d658e416 '!=' 626c4a12-ec92-46bb-b59a-ca50d658e416 ']' 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 86749 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 86749 ']' 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 86749 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 86749 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 86749' 00:11:20.230 killing process with pid 86749 00:11:20.230 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 86749 00:11:20.230 [2024-07-15 09:15:29.069533] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:20.230 [2024-07-15 09:15:29.069585] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:20.231 [2024-07-15 09:15:29.069628] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:20.231 [2024-07-15 09:15:29.069639] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc0dec0 name raid_bdev1, state offline 00:11:20.231 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 86749 00:11:20.231 [2024-07-15 09:15:29.087461] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:20.489 09:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:20.489 00:11:20.489 real 0m11.051s 00:11:20.489 user 0m20.193s 00:11:20.489 sys 0m2.068s 00:11:20.489 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:20.489 09:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.489 ************************************ 00:11:20.489 END TEST raid_superblock_test 00:11:20.489 ************************************ 00:11:20.489 09:15:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:20.489 09:15:29 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:11:20.489 09:15:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:20.489 09:15:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:20.489 09:15:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:20.489 ************************************ 00:11:20.489 START TEST raid_read_error_test 00:11:20.489 ************************************ 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 
-- # local bdevperf_log 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.2M8F9vDJuR 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=88379 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 88379 /var/tmp/spdk-raid.sock 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 88379 ']' 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:20.489 09:15:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:20.489 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:20.490 09:15:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:20.490 09:15:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.747 [2024-07-15 09:15:29.477423] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:11:20.747 [2024-07-15 09:15:29.477497] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid88379 ] 00:11:20.747 [2024-07-15 09:15:29.608752] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:21.005 [2024-07-15 09:15:29.710740] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:21.005 [2024-07-15 09:15:29.777178] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.005 [2024-07-15 09:15:29.777218] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.571 09:15:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:21.571 09:15:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:21.571 09:15:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:21.571 09:15:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:21.829 BaseBdev1_malloc 00:11:21.829 09:15:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:21.829 true 00:11:21.829 09:15:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:22.087 [2024-07-15 09:15:30.921313] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:22.087 [2024-07-15 09:15:30.921362] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:22.087 [2024-07-15 09:15:30.921381] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x219f0d0 00:11:22.087 [2024-07-15 09:15:30.921393] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:22.087 [2024-07-15 09:15:30.923089] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:22.087 [2024-07-15 09:15:30.923117] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:22.087 BaseBdev1 00:11:22.087 09:15:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:22.087 09:15:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:22.345 BaseBdev2_malloc 00:11:22.345 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:22.345 true 00:11:22.345 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:22.604 [2024-07-15 09:15:31.451394] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:22.604 [2024-07-15 09:15:31.451444] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:22.604 [2024-07-15 09:15:31.451464] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a3910 00:11:22.604 [2024-07-15 09:15:31.451477] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:22.604 [2024-07-15 09:15:31.452989] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:22.604 [2024-07-15 09:15:31.453017] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:22.604 BaseBdev2 00:11:22.604 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:22.862 [2024-07-15 09:15:31.635910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:22.862 [2024-07-15 09:15:31.637198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:22.862 [2024-07-15 09:15:31.637392] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21a5320 00:11:22.862 [2024-07-15 09:15:31.637405] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:22.862 [2024-07-15 09:15:31.637604] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a4270 00:11:22.862 [2024-07-15 09:15:31.637748] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21a5320 00:11:22.862 [2024-07-15 09:15:31.637758] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21a5320 00:11:22.863 [2024-07-15 09:15:31.637862] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.863 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:23.121 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:23.121 "name": "raid_bdev1", 00:11:23.121 "uuid": "e544a760-20cd-4d56-9d5f-b9d6d2b27576", 00:11:23.121 "strip_size_kb": 64, 00:11:23.121 "state": "online", 00:11:23.121 "raid_level": "raid0", 
00:11:23.121 "superblock": true, 00:11:23.121 "num_base_bdevs": 2, 00:11:23.121 "num_base_bdevs_discovered": 2, 00:11:23.121 "num_base_bdevs_operational": 2, 00:11:23.121 "base_bdevs_list": [ 00:11:23.121 { 00:11:23.121 "name": "BaseBdev1", 00:11:23.121 "uuid": "42d6e488-7db6-5e11-a82e-bb9f78f189e1", 00:11:23.121 "is_configured": true, 00:11:23.121 "data_offset": 2048, 00:11:23.121 "data_size": 63488 00:11:23.121 }, 00:11:23.121 { 00:11:23.121 "name": "BaseBdev2", 00:11:23.121 "uuid": "41314b71-34ee-5835-9a93-46a164e1d805", 00:11:23.121 "is_configured": true, 00:11:23.121 "data_offset": 2048, 00:11:23.121 "data_size": 63488 00:11:23.121 } 00:11:23.121 ] 00:11:23.121 }' 00:11:23.121 09:15:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:23.121 09:15:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.687 09:15:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:23.687 09:15:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:23.687 [2024-07-15 09:15:32.582708] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a09b0 00:11:24.624 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.882 09:15:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:25.447 09:15:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.447 "name": "raid_bdev1", 00:11:25.447 "uuid": "e544a760-20cd-4d56-9d5f-b9d6d2b27576", 00:11:25.447 "strip_size_kb": 64, 00:11:25.447 "state": "online", 00:11:25.447 
"raid_level": "raid0", 00:11:25.447 "superblock": true, 00:11:25.447 "num_base_bdevs": 2, 00:11:25.447 "num_base_bdevs_discovered": 2, 00:11:25.447 "num_base_bdevs_operational": 2, 00:11:25.447 "base_bdevs_list": [ 00:11:25.447 { 00:11:25.447 "name": "BaseBdev1", 00:11:25.447 "uuid": "42d6e488-7db6-5e11-a82e-bb9f78f189e1", 00:11:25.447 "is_configured": true, 00:11:25.447 "data_offset": 2048, 00:11:25.447 "data_size": 63488 00:11:25.447 }, 00:11:25.447 { 00:11:25.447 "name": "BaseBdev2", 00:11:25.447 "uuid": "41314b71-34ee-5835-9a93-46a164e1d805", 00:11:25.447 "is_configured": true, 00:11:25.447 "data_offset": 2048, 00:11:25.447 "data_size": 63488 00:11:25.447 } 00:11:25.447 ] 00:11:25.447 }' 00:11:25.447 09:15:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.447 09:15:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.013 09:15:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:26.271 [2024-07-15 09:15:35.042627] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:26.271 [2024-07-15 09:15:35.042670] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:26.271 [2024-07-15 09:15:35.045907] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:26.271 [2024-07-15 09:15:35.045944] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:26.271 [2024-07-15 09:15:35.045973] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:26.271 [2024-07-15 09:15:35.045984] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21a5320 name raid_bdev1, state offline 00:11:26.271 0 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 88379 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 88379 ']' 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 88379 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 88379 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 88379' 00:11:26.271 killing process with pid 88379 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 88379 00:11:26.271 [2024-07-15 09:15:35.110456] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:26.271 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 88379 00:11:26.271 [2024-07-15 09:15:35.121019] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.2M8F9vDJuR 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 
-- # grep raid_bdev1 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.41 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.41 != \0\.\0\0 ]] 00:11:26.530 00:11:26.530 real 0m5.963s 00:11:26.530 user 0m9.211s 00:11:26.530 sys 0m1.065s 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:26.530 09:15:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.530 ************************************ 00:11:26.530 END TEST raid_read_error_test 00:11:26.530 ************************************ 00:11:26.530 09:15:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:26.530 09:15:35 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:11:26.530 09:15:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:26.530 09:15:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:26.530 09:15:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:26.530 ************************************ 00:11:26.530 START TEST raid_write_error_test 00:11:26.530 ************************************ 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local 
bdevperf_log 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.1oXNGg0NZv 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=89192 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 89192 /var/tmp/spdk-raid.sock 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 89192 ']' 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:26.530 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:26.530 09:15:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.790 [2024-07-15 09:15:35.514871] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:11:26.790 [2024-07-15 09:15:35.514947] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89192 ] 00:11:26.790 [2024-07-15 09:15:35.645966] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:27.048 [2024-07-15 09:15:35.752243] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:27.048 [2024-07-15 09:15:35.814834] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:27.048 [2024-07-15 09:15:35.814872] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:27.614 09:15:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:27.614 09:15:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:27.614 09:15:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:27.614 09:15:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:27.872 BaseBdev1_malloc 00:11:27.872 09:15:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:28.130 true 00:11:28.130 09:15:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:28.388 [2024-07-15 09:15:37.171944] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:28.388 [2024-07-15 09:15:37.171989] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:28.388 [2024-07-15 09:15:37.172010] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25420d0 00:11:28.388 [2024-07-15 09:15:37.172023] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:28.388 [2024-07-15 09:15:37.173943] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:28.388 [2024-07-15 09:15:37.173973] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:28.388 BaseBdev1 00:11:28.388 09:15:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:28.388 09:15:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:28.646 BaseBdev2_malloc 00:11:28.646 09:15:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:28.904 true 00:11:28.904 09:15:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:29.162 [2024-07-15 09:15:37.907712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:29.162 [2024-07-15 09:15:37.907757] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:29.162 [2024-07-15 09:15:37.907779] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2546910 00:11:29.162 [2024-07-15 09:15:37.907798] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:29.162 [2024-07-15 09:15:37.909408] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:29.162 [2024-07-15 09:15:37.909436] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:29.162 BaseBdev2 00:11:29.162 09:15:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:29.420 [2024-07-15 09:15:38.148381] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:29.420 [2024-07-15 09:15:38.149762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:29.420 [2024-07-15 09:15:38.149967] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2548320 00:11:29.420 [2024-07-15 09:15:38.149981] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:29.420 [2024-07-15 09:15:38.150185] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2547270 00:11:29.420 [2024-07-15 09:15:38.150337] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2548320 00:11:29.420 [2024-07-15 09:15:38.150348] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2548320 00:11:29.420 [2024-07-15 09:15:38.150456] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.420 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:29.678 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:29.678 "name": "raid_bdev1", 00:11:29.678 "uuid": "c3786af5-9e5f-452d-a698-57d45a603a4f", 00:11:29.678 "strip_size_kb": 64, 00:11:29.678 "state": "online", 00:11:29.678 
"raid_level": "raid0", 00:11:29.678 "superblock": true, 00:11:29.678 "num_base_bdevs": 2, 00:11:29.678 "num_base_bdevs_discovered": 2, 00:11:29.678 "num_base_bdevs_operational": 2, 00:11:29.678 "base_bdevs_list": [ 00:11:29.678 { 00:11:29.678 "name": "BaseBdev1", 00:11:29.678 "uuid": "0b2a810a-4749-58f0-b4a2-af156788c78a", 00:11:29.678 "is_configured": true, 00:11:29.678 "data_offset": 2048, 00:11:29.678 "data_size": 63488 00:11:29.678 }, 00:11:29.678 { 00:11:29.678 "name": "BaseBdev2", 00:11:29.678 "uuid": "eb46e673-841c-5a2f-a58a-974316db7689", 00:11:29.678 "is_configured": true, 00:11:29.678 "data_offset": 2048, 00:11:29.678 "data_size": 63488 00:11:29.678 } 00:11:29.678 ] 00:11:29.678 }' 00:11:29.678 09:15:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:29.678 09:15:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.287 09:15:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:30.287 09:15:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:30.287 [2024-07-15 09:15:39.115236] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25439b0 00:11:31.236 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.493 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:31.750 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.750 "name": "raid_bdev1", 00:11:31.750 "uuid": "c3786af5-9e5f-452d-a698-57d45a603a4f", 00:11:31.750 "strip_size_kb": 64, 
00:11:31.750 "state": "online", 00:11:31.750 "raid_level": "raid0", 00:11:31.750 "superblock": true, 00:11:31.750 "num_base_bdevs": 2, 00:11:31.750 "num_base_bdevs_discovered": 2, 00:11:31.750 "num_base_bdevs_operational": 2, 00:11:31.750 "base_bdevs_list": [ 00:11:31.750 { 00:11:31.750 "name": "BaseBdev1", 00:11:31.750 "uuid": "0b2a810a-4749-58f0-b4a2-af156788c78a", 00:11:31.750 "is_configured": true, 00:11:31.750 "data_offset": 2048, 00:11:31.750 "data_size": 63488 00:11:31.750 }, 00:11:31.750 { 00:11:31.750 "name": "BaseBdev2", 00:11:31.750 "uuid": "eb46e673-841c-5a2f-a58a-974316db7689", 00:11:31.750 "is_configured": true, 00:11:31.750 "data_offset": 2048, 00:11:31.750 "data_size": 63488 00:11:31.750 } 00:11:31.750 ] 00:11:31.750 }' 00:11:31.750 09:15:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.750 09:15:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.315 09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:32.573 [2024-07-15 09:15:41.344875] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:32.573 [2024-07-15 09:15:41.344914] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:32.573 [2024-07-15 09:15:41.348102] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:32.573 [2024-07-15 09:15:41.348132] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:32.573 [2024-07-15 09:15:41.348160] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:32.573 [2024-07-15 09:15:41.348172] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2548320 name raid_bdev1, state offline 00:11:32.573 0 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 89192 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 89192 ']' 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 89192 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 89192 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 89192' 00:11:32.573 killing process with pid 89192 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 89192 00:11:32.573 [2024-07-15 09:15:41.407626] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:32.573 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 89192 00:11:32.573 [2024-07-15 09:15:41.419728] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:32.832 09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.1oXNGg0NZv 00:11:32.832 
09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:32.832 09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:32.832 09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:11:32.832 09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:32.832 09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:32.832 09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:32.832 09:15:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:11:32.832 00:11:32.832 real 0m6.220s 00:11:32.832 user 0m9.732s 00:11:32.832 sys 0m1.086s 00:11:32.832 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:32.832 09:15:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.832 ************************************ 00:11:32.832 END TEST raid_write_error_test 00:11:32.832 ************************************ 00:11:32.832 09:15:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:32.832 09:15:41 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:32.832 09:15:41 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:32.832 09:15:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:32.832 09:15:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:32.832 09:15:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:32.832 ************************************ 00:11:32.832 START TEST raid_state_function_test 00:11:32.832 ************************************ 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 
-- # local raid_bdev_name=Existed_Raid 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=90166 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 90166' 00:11:32.832 Process raid pid: 90166 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 90166 /var/tmp/spdk-raid.sock 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 90166 ']' 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:32.832 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:32.832 09:15:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.091 [2024-07-15 09:15:41.814070] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
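The state transitions exercised in the trace that follows can likewise be reproduced against a bare bdev_svc app started as shown in the log. This is a minimal sketch using only RPCs that appear in the trace; the condensed form relies on the raid module claiming base bdevs as they are created, which is what the claim messages in the trace show, and since the trace's has_redundancy check treats concat as non-redundant, removing a base bdev drops the array from online straight to offline:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    $SPDK/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    sleep 1    # the harness uses waitforlisten; a short sleep stands in for it here

    # created before its base bdevs exist, the array sits in "configuring"
    $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'

    # once both bases show up it moves to "online"
    $RPC bdev_malloc_create 32 512 -b BaseBdev1
    $RPC bdev_malloc_create 32 512 -b BaseBdev2
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'

    # deleting a base of a non-redundant level drops it to "offline"
    $RPC bdev_malloc_delete BaseBdev1
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'

    kill %1    # stop bdev_svc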
00:11:33.091 [2024-07-15 09:15:41.814138] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:33.091 [2024-07-15 09:15:41.942720] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:33.349 [2024-07-15 09:15:42.044385] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:33.349 [2024-07-15 09:15:42.107414] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:33.349 [2024-07-15 09:15:42.107446] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:33.913 09:15:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:33.913 09:15:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:33.913 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:34.171 [2024-07-15 09:15:42.966804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:34.171 [2024-07-15 09:15:42.966848] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:34.171 [2024-07-15 09:15:42.966859] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:34.171 [2024-07-15 09:15:42.966870] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.171 09:15:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:34.438 09:15:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:34.438 "name": "Existed_Raid", 00:11:34.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.438 "strip_size_kb": 64, 00:11:34.438 "state": "configuring", 00:11:34.438 "raid_level": "concat", 00:11:34.438 "superblock": false, 
00:11:34.438 "num_base_bdevs": 2, 00:11:34.438 "num_base_bdevs_discovered": 0, 00:11:34.438 "num_base_bdevs_operational": 2, 00:11:34.438 "base_bdevs_list": [ 00:11:34.438 { 00:11:34.438 "name": "BaseBdev1", 00:11:34.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.438 "is_configured": false, 00:11:34.438 "data_offset": 0, 00:11:34.438 "data_size": 0 00:11:34.438 }, 00:11:34.438 { 00:11:34.438 "name": "BaseBdev2", 00:11:34.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.438 "is_configured": false, 00:11:34.438 "data_offset": 0, 00:11:34.438 "data_size": 0 00:11:34.438 } 00:11:34.438 ] 00:11:34.438 }' 00:11:34.438 09:15:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:34.438 09:15:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.003 09:15:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:35.261 [2024-07-15 09:15:44.073713] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:35.261 [2024-07-15 09:15:44.073744] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ac5a80 name Existed_Raid, state configuring 00:11:35.261 09:15:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:35.519 [2024-07-15 09:15:44.318371] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:35.519 [2024-07-15 09:15:44.318405] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:35.519 [2024-07-15 09:15:44.318415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:35.519 [2024-07-15 09:15:44.318426] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:35.520 09:15:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:35.778 [2024-07-15 09:15:44.572814] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:35.778 BaseBdev1 00:11:35.778 09:15:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:35.778 09:15:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:35.778 09:15:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:35.778 09:15:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:35.778 09:15:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:35.778 09:15:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:35.778 09:15:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:36.036 09:15:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:11:36.295 [ 00:11:36.295 { 00:11:36.295 "name": "BaseBdev1", 00:11:36.295 "aliases": [ 00:11:36.295 "e3ff8c78-6c13-49c7-9e37-fcca0099ae0d" 00:11:36.295 ], 00:11:36.295 "product_name": "Malloc disk", 00:11:36.295 "block_size": 512, 00:11:36.295 "num_blocks": 65536, 00:11:36.295 "uuid": "e3ff8c78-6c13-49c7-9e37-fcca0099ae0d", 00:11:36.295 "assigned_rate_limits": { 00:11:36.295 "rw_ios_per_sec": 0, 00:11:36.295 "rw_mbytes_per_sec": 0, 00:11:36.295 "r_mbytes_per_sec": 0, 00:11:36.295 "w_mbytes_per_sec": 0 00:11:36.295 }, 00:11:36.295 "claimed": true, 00:11:36.295 "claim_type": "exclusive_write", 00:11:36.295 "zoned": false, 00:11:36.295 "supported_io_types": { 00:11:36.295 "read": true, 00:11:36.295 "write": true, 00:11:36.295 "unmap": true, 00:11:36.295 "flush": true, 00:11:36.295 "reset": true, 00:11:36.295 "nvme_admin": false, 00:11:36.295 "nvme_io": false, 00:11:36.295 "nvme_io_md": false, 00:11:36.295 "write_zeroes": true, 00:11:36.295 "zcopy": true, 00:11:36.295 "get_zone_info": false, 00:11:36.295 "zone_management": false, 00:11:36.295 "zone_append": false, 00:11:36.295 "compare": false, 00:11:36.295 "compare_and_write": false, 00:11:36.295 "abort": true, 00:11:36.295 "seek_hole": false, 00:11:36.295 "seek_data": false, 00:11:36.295 "copy": true, 00:11:36.295 "nvme_iov_md": false 00:11:36.295 }, 00:11:36.295 "memory_domains": [ 00:11:36.295 { 00:11:36.295 "dma_device_id": "system", 00:11:36.295 "dma_device_type": 1 00:11:36.295 }, 00:11:36.295 { 00:11:36.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.295 "dma_device_type": 2 00:11:36.295 } 00:11:36.295 ], 00:11:36.295 "driver_specific": {} 00:11:36.295 } 00:11:36.295 ] 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.295 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:36.553 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:36.553 "name": "Existed_Raid", 00:11:36.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:36.553 "strip_size_kb": 64, 00:11:36.553 "state": "configuring", 00:11:36.553 
"raid_level": "concat", 00:11:36.553 "superblock": false, 00:11:36.553 "num_base_bdevs": 2, 00:11:36.553 "num_base_bdevs_discovered": 1, 00:11:36.553 "num_base_bdevs_operational": 2, 00:11:36.553 "base_bdevs_list": [ 00:11:36.553 { 00:11:36.554 "name": "BaseBdev1", 00:11:36.554 "uuid": "e3ff8c78-6c13-49c7-9e37-fcca0099ae0d", 00:11:36.554 "is_configured": true, 00:11:36.554 "data_offset": 0, 00:11:36.554 "data_size": 65536 00:11:36.554 }, 00:11:36.554 { 00:11:36.554 "name": "BaseBdev2", 00:11:36.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:36.554 "is_configured": false, 00:11:36.554 "data_offset": 0, 00:11:36.554 "data_size": 0 00:11:36.554 } 00:11:36.554 ] 00:11:36.554 }' 00:11:36.554 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.554 09:15:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.120 09:15:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:37.378 [2024-07-15 09:15:46.112896] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:37.378 [2024-07-15 09:15:46.112945] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ac5350 name Existed_Raid, state configuring 00:11:37.378 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:37.636 [2024-07-15 09:15:46.357577] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:37.636 [2024-07-15 09:15:46.359063] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:37.636 [2024-07-15 09:15:46.359097] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.636 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.636 09:15:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.895 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.895 "name": "Existed_Raid", 00:11:37.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.895 "strip_size_kb": 64, 00:11:37.895 "state": "configuring", 00:11:37.895 "raid_level": "concat", 00:11:37.895 "superblock": false, 00:11:37.895 "num_base_bdevs": 2, 00:11:37.895 "num_base_bdevs_discovered": 1, 00:11:37.895 "num_base_bdevs_operational": 2, 00:11:37.895 "base_bdevs_list": [ 00:11:37.895 { 00:11:37.895 "name": "BaseBdev1", 00:11:37.895 "uuid": "e3ff8c78-6c13-49c7-9e37-fcca0099ae0d", 00:11:37.895 "is_configured": true, 00:11:37.895 "data_offset": 0, 00:11:37.895 "data_size": 65536 00:11:37.895 }, 00:11:37.895 { 00:11:37.895 "name": "BaseBdev2", 00:11:37.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.895 "is_configured": false, 00:11:37.895 "data_offset": 0, 00:11:37.895 "data_size": 0 00:11:37.895 } 00:11:37.895 ] 00:11:37.895 }' 00:11:37.895 09:15:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.895 09:15:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.461 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:38.719 [2024-07-15 09:15:47.455808] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:38.719 [2024-07-15 09:15:47.455843] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ac6000 00:11:38.719 [2024-07-15 09:15:47.455851] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:38.719 [2024-07-15 09:15:47.456043] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19e00c0 00:11:38.719 [2024-07-15 09:15:47.456166] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ac6000 00:11:38.719 [2024-07-15 09:15:47.456176] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ac6000 00:11:38.719 [2024-07-15 09:15:47.456340] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:38.719 BaseBdev2 00:11:38.719 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:38.719 09:15:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:38.719 09:15:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:38.719 09:15:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:38.719 09:15:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:38.719 09:15:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:38.719 09:15:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:38.977 09:15:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:39.235 [ 00:11:39.235 { 00:11:39.235 "name": "BaseBdev2", 00:11:39.235 "aliases": [ 00:11:39.235 "d04bfaed-f545-40bb-9c57-d87f1cbb046b" 00:11:39.235 ], 00:11:39.235 "product_name": "Malloc disk", 00:11:39.235 "block_size": 512, 00:11:39.235 "num_blocks": 65536, 00:11:39.235 "uuid": "d04bfaed-f545-40bb-9c57-d87f1cbb046b", 00:11:39.235 "assigned_rate_limits": { 00:11:39.235 "rw_ios_per_sec": 0, 00:11:39.235 "rw_mbytes_per_sec": 0, 00:11:39.235 "r_mbytes_per_sec": 0, 00:11:39.235 "w_mbytes_per_sec": 0 00:11:39.235 }, 00:11:39.235 "claimed": true, 00:11:39.235 "claim_type": "exclusive_write", 00:11:39.235 "zoned": false, 00:11:39.235 "supported_io_types": { 00:11:39.235 "read": true, 00:11:39.235 "write": true, 00:11:39.235 "unmap": true, 00:11:39.235 "flush": true, 00:11:39.235 "reset": true, 00:11:39.235 "nvme_admin": false, 00:11:39.235 "nvme_io": false, 00:11:39.235 "nvme_io_md": false, 00:11:39.235 "write_zeroes": true, 00:11:39.235 "zcopy": true, 00:11:39.235 "get_zone_info": false, 00:11:39.235 "zone_management": false, 00:11:39.235 "zone_append": false, 00:11:39.235 "compare": false, 00:11:39.235 "compare_and_write": false, 00:11:39.235 "abort": true, 00:11:39.235 "seek_hole": false, 00:11:39.235 "seek_data": false, 00:11:39.235 "copy": true, 00:11:39.235 "nvme_iov_md": false 00:11:39.235 }, 00:11:39.235 "memory_domains": [ 00:11:39.235 { 00:11:39.235 "dma_device_id": "system", 00:11:39.235 "dma_device_type": 1 00:11:39.235 }, 00:11:39.235 { 00:11:39.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.235 "dma_device_type": 2 00:11:39.235 } 00:11:39.235 ], 00:11:39.235 "driver_specific": {} 00:11:39.235 } 00:11:39.235 ] 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.235 09:15:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:39.492 09:15:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.492 "name": "Existed_Raid", 00:11:39.492 "uuid": "d9b3ba3b-2106-4bde-8da2-328040229581", 00:11:39.492 "strip_size_kb": 64, 00:11:39.492 "state": "online", 00:11:39.492 "raid_level": "concat", 00:11:39.492 "superblock": false, 00:11:39.492 "num_base_bdevs": 2, 00:11:39.492 "num_base_bdevs_discovered": 2, 00:11:39.492 "num_base_bdevs_operational": 2, 00:11:39.492 "base_bdevs_list": [ 00:11:39.492 { 00:11:39.492 "name": "BaseBdev1", 00:11:39.492 "uuid": "e3ff8c78-6c13-49c7-9e37-fcca0099ae0d", 00:11:39.492 "is_configured": true, 00:11:39.492 "data_offset": 0, 00:11:39.492 "data_size": 65536 00:11:39.492 }, 00:11:39.492 { 00:11:39.492 "name": "BaseBdev2", 00:11:39.492 "uuid": "d04bfaed-f545-40bb-9c57-d87f1cbb046b", 00:11:39.492 "is_configured": true, 00:11:39.492 "data_offset": 0, 00:11:39.492 "data_size": 65536 00:11:39.492 } 00:11:39.492 ] 00:11:39.492 }' 00:11:39.492 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.492 09:15:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:40.056 [2024-07-15 09:15:48.972113] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:40.056 "name": "Existed_Raid", 00:11:40.056 "aliases": [ 00:11:40.056 "d9b3ba3b-2106-4bde-8da2-328040229581" 00:11:40.056 ], 00:11:40.056 "product_name": "Raid Volume", 00:11:40.056 "block_size": 512, 00:11:40.056 "num_blocks": 131072, 00:11:40.056 "uuid": "d9b3ba3b-2106-4bde-8da2-328040229581", 00:11:40.056 "assigned_rate_limits": { 00:11:40.056 "rw_ios_per_sec": 0, 00:11:40.056 "rw_mbytes_per_sec": 0, 00:11:40.056 "r_mbytes_per_sec": 0, 00:11:40.056 "w_mbytes_per_sec": 0 00:11:40.056 }, 00:11:40.056 "claimed": false, 00:11:40.056 "zoned": false, 00:11:40.056 "supported_io_types": { 00:11:40.056 "read": true, 00:11:40.056 "write": true, 00:11:40.056 "unmap": true, 00:11:40.056 "flush": true, 00:11:40.056 "reset": true, 00:11:40.056 "nvme_admin": false, 00:11:40.056 "nvme_io": false, 00:11:40.056 "nvme_io_md": false, 00:11:40.056 "write_zeroes": true, 00:11:40.056 "zcopy": false, 00:11:40.056 "get_zone_info": false, 00:11:40.056 "zone_management": false, 00:11:40.056 "zone_append": false, 00:11:40.056 "compare": false, 00:11:40.056 "compare_and_write": false, 00:11:40.056 "abort": false, 00:11:40.056 "seek_hole": false, 00:11:40.056 "seek_data": false, 00:11:40.056 "copy": false, 00:11:40.056 
"nvme_iov_md": false 00:11:40.056 }, 00:11:40.056 "memory_domains": [ 00:11:40.056 { 00:11:40.056 "dma_device_id": "system", 00:11:40.056 "dma_device_type": 1 00:11:40.056 }, 00:11:40.056 { 00:11:40.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.056 "dma_device_type": 2 00:11:40.056 }, 00:11:40.056 { 00:11:40.056 "dma_device_id": "system", 00:11:40.056 "dma_device_type": 1 00:11:40.056 }, 00:11:40.056 { 00:11:40.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.056 "dma_device_type": 2 00:11:40.056 } 00:11:40.056 ], 00:11:40.056 "driver_specific": { 00:11:40.056 "raid": { 00:11:40.056 "uuid": "d9b3ba3b-2106-4bde-8da2-328040229581", 00:11:40.056 "strip_size_kb": 64, 00:11:40.056 "state": "online", 00:11:40.056 "raid_level": "concat", 00:11:40.056 "superblock": false, 00:11:40.056 "num_base_bdevs": 2, 00:11:40.056 "num_base_bdevs_discovered": 2, 00:11:40.056 "num_base_bdevs_operational": 2, 00:11:40.056 "base_bdevs_list": [ 00:11:40.056 { 00:11:40.056 "name": "BaseBdev1", 00:11:40.056 "uuid": "e3ff8c78-6c13-49c7-9e37-fcca0099ae0d", 00:11:40.056 "is_configured": true, 00:11:40.056 "data_offset": 0, 00:11:40.056 "data_size": 65536 00:11:40.056 }, 00:11:40.056 { 00:11:40.056 "name": "BaseBdev2", 00:11:40.056 "uuid": "d04bfaed-f545-40bb-9c57-d87f1cbb046b", 00:11:40.056 "is_configured": true, 00:11:40.056 "data_offset": 0, 00:11:40.056 "data_size": 65536 00:11:40.056 } 00:11:40.056 ] 00:11:40.056 } 00:11:40.056 } 00:11:40.056 }' 00:11:40.056 09:15:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:40.314 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:40.314 BaseBdev2' 00:11:40.314 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:40.314 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:40.314 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:40.572 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:40.572 "name": "BaseBdev1", 00:11:40.572 "aliases": [ 00:11:40.572 "e3ff8c78-6c13-49c7-9e37-fcca0099ae0d" 00:11:40.572 ], 00:11:40.572 "product_name": "Malloc disk", 00:11:40.572 "block_size": 512, 00:11:40.572 "num_blocks": 65536, 00:11:40.572 "uuid": "e3ff8c78-6c13-49c7-9e37-fcca0099ae0d", 00:11:40.572 "assigned_rate_limits": { 00:11:40.572 "rw_ios_per_sec": 0, 00:11:40.572 "rw_mbytes_per_sec": 0, 00:11:40.572 "r_mbytes_per_sec": 0, 00:11:40.572 "w_mbytes_per_sec": 0 00:11:40.572 }, 00:11:40.572 "claimed": true, 00:11:40.572 "claim_type": "exclusive_write", 00:11:40.572 "zoned": false, 00:11:40.572 "supported_io_types": { 00:11:40.572 "read": true, 00:11:40.572 "write": true, 00:11:40.572 "unmap": true, 00:11:40.572 "flush": true, 00:11:40.572 "reset": true, 00:11:40.572 "nvme_admin": false, 00:11:40.572 "nvme_io": false, 00:11:40.572 "nvme_io_md": false, 00:11:40.572 "write_zeroes": true, 00:11:40.572 "zcopy": true, 00:11:40.572 "get_zone_info": false, 00:11:40.572 "zone_management": false, 00:11:40.572 "zone_append": false, 00:11:40.572 "compare": false, 00:11:40.572 "compare_and_write": false, 00:11:40.572 "abort": true, 00:11:40.572 "seek_hole": false, 00:11:40.572 "seek_data": false, 00:11:40.572 "copy": true, 00:11:40.572 
"nvme_iov_md": false 00:11:40.572 }, 00:11:40.572 "memory_domains": [ 00:11:40.572 { 00:11:40.572 "dma_device_id": "system", 00:11:40.572 "dma_device_type": 1 00:11:40.572 }, 00:11:40.572 { 00:11:40.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.572 "dma_device_type": 2 00:11:40.572 } 00:11:40.572 ], 00:11:40.572 "driver_specific": {} 00:11:40.572 }' 00:11:40.572 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.572 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.572 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:40.572 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.572 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.572 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:40.572 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:40.572 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:40.830 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:40.830 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:40.830 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:40.830 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:40.830 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:40.830 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:40.830 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:41.088 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:41.088 "name": "BaseBdev2", 00:11:41.088 "aliases": [ 00:11:41.088 "d04bfaed-f545-40bb-9c57-d87f1cbb046b" 00:11:41.088 ], 00:11:41.088 "product_name": "Malloc disk", 00:11:41.088 "block_size": 512, 00:11:41.088 "num_blocks": 65536, 00:11:41.088 "uuid": "d04bfaed-f545-40bb-9c57-d87f1cbb046b", 00:11:41.088 "assigned_rate_limits": { 00:11:41.088 "rw_ios_per_sec": 0, 00:11:41.088 "rw_mbytes_per_sec": 0, 00:11:41.088 "r_mbytes_per_sec": 0, 00:11:41.088 "w_mbytes_per_sec": 0 00:11:41.088 }, 00:11:41.088 "claimed": true, 00:11:41.088 "claim_type": "exclusive_write", 00:11:41.088 "zoned": false, 00:11:41.088 "supported_io_types": { 00:11:41.088 "read": true, 00:11:41.089 "write": true, 00:11:41.089 "unmap": true, 00:11:41.089 "flush": true, 00:11:41.089 "reset": true, 00:11:41.089 "nvme_admin": false, 00:11:41.089 "nvme_io": false, 00:11:41.089 "nvme_io_md": false, 00:11:41.089 "write_zeroes": true, 00:11:41.089 "zcopy": true, 00:11:41.089 "get_zone_info": false, 00:11:41.089 "zone_management": false, 00:11:41.089 "zone_append": false, 00:11:41.089 "compare": false, 00:11:41.089 "compare_and_write": false, 00:11:41.089 "abort": true, 00:11:41.089 "seek_hole": false, 00:11:41.089 "seek_data": false, 00:11:41.089 "copy": true, 00:11:41.089 "nvme_iov_md": false 00:11:41.089 }, 00:11:41.089 "memory_domains": [ 00:11:41.089 { 00:11:41.089 "dma_device_id": "system", 00:11:41.089 "dma_device_type": 1 00:11:41.089 }, 
00:11:41.089 { 00:11:41.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.089 "dma_device_type": 2 00:11:41.089 } 00:11:41.089 ], 00:11:41.089 "driver_specific": {} 00:11:41.089 }' 00:11:41.089 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.089 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.089 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:41.089 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.089 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.089 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:41.089 09:15:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.346 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.346 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:41.346 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:41.346 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:41.346 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:41.346 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:41.604 [2024-07-15 09:15:50.343536] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:41.604 [2024-07-15 09:15:50.343561] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:41.604 [2024-07-15 09:15:50.343601] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:41.604 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:41.604 "name": "Existed_Raid", 00:11:41.604 "uuid": "d9b3ba3b-2106-4bde-8da2-328040229581", 00:11:41.604 "strip_size_kb": 64, 00:11:41.604 "state": "offline", 00:11:41.605 "raid_level": "concat", 00:11:41.605 "superblock": false, 00:11:41.605 "num_base_bdevs": 2, 00:11:41.605 "num_base_bdevs_discovered": 1, 00:11:41.605 "num_base_bdevs_operational": 1, 00:11:41.605 "base_bdevs_list": [ 00:11:41.605 { 00:11:41.605 "name": null, 00:11:41.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:41.605 "is_configured": false, 00:11:41.605 "data_offset": 0, 00:11:41.605 "data_size": 65536 00:11:41.605 }, 00:11:41.605 { 00:11:41.605 "name": "BaseBdev2", 00:11:41.605 "uuid": "d04bfaed-f545-40bb-9c57-d87f1cbb046b", 00:11:41.605 "is_configured": true, 00:11:41.605 "data_offset": 0, 00:11:41.605 "data_size": 65536 00:11:41.605 } 00:11:41.605 ] 00:11:41.605 }' 00:11:41.605 09:15:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:41.605 09:15:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:42.170 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:42.170 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:42.170 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.170 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:42.428 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:42.428 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:42.428 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:42.994 [2024-07-15 09:15:51.844543] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:42.994 [2024-07-15 09:15:51.844594] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ac6000 name Existed_Raid, state offline 00:11:42.994 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:42.994 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:42.994 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.994 09:15:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 90166 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 90166 ']' 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 90166 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 90166 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 90166' 00:11:43.253 killing process with pid 90166 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 90166 00:11:43.253 [2024-07-15 09:15:52.163202] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:43.253 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 90166 00:11:43.253 [2024-07-15 09:15:52.164147] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:43.512 09:15:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:43.512 00:11:43.512 real 0m10.645s 00:11:43.512 user 0m18.970s 00:11:43.512 sys 0m1.925s 00:11:43.512 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:43.512 09:15:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.512 ************************************ 00:11:43.512 END TEST raid_state_function_test 00:11:43.512 ************************************ 00:11:43.512 09:15:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:43.512 09:15:52 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:43.512 09:15:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:43.512 09:15:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:43.512 09:15:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:43.833 ************************************ 00:11:43.833 START TEST raid_state_function_test_sb 00:11:43.833 ************************************ 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:43.833 09:15:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=91820 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 91820' 00:11:43.833 Process raid pid: 91820 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 91820 /var/tmp/spdk-raid.sock 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 91820 ']' 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:43.833 09:15:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:43.834 09:15:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:43.834 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
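For orientation, the raid_state_function_test_sb trace that follows drives a small set of rpc.py calls against the bdev_svc app started above. The sketch below is a condensed, hand-written replay of that flow, not test output: it uses only RPCs that appear verbatim in the trace, the rpc() helper is shorthand introduced here for readability, and it assumes the target is still listening on /var/tmp/spdk-raid.sock.

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }   # shorthand for this sketch only

# Create the raid bdev first; with no base bdevs present it sits in state "configuring".
rpc bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

# Add the base bdevs; once both are discovered the raid transitions to "online".
rpc bdev_malloc_create 32 512 -b BaseBdev1
rpc bdev_malloc_create 32 512 -b BaseBdev2

# Inspect the assembled array (state, raid_level, num_base_bdevs_discovered, ...).
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

# Remove one base bdev; concat carries no redundancy, so the raid drops to "offline".
rpc bdev_malloc_delete BaseBdev1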
00:11:43.834 09:15:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:43.834 09:15:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:43.834 09:15:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:43.834 [2024-07-15 09:15:52.528313] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:11:43.834 [2024-07-15 09:15:52.528380] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:43.834 [2024-07-15 09:15:52.659237] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:44.093 [2024-07-15 09:15:52.771749] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:44.093 [2024-07-15 09:15:52.840545] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:44.093 [2024-07-15 09:15:52.840607] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:44.660 09:15:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:44.660 09:15:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:44.660 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:44.918 [2024-07-15 09:15:53.671995] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:44.918 [2024-07-15 09:15:53.672048] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:44.918 [2024-07-15 09:15:53.672059] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:44.918 [2024-07-15 09:15:53.672071] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.918 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:45.177 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:45.177 "name": "Existed_Raid", 00:11:45.177 "uuid": "a5b5246b-9c14-401c-a47e-1e6aca20864e", 00:11:45.177 "strip_size_kb": 64, 00:11:45.177 "state": "configuring", 00:11:45.177 "raid_level": "concat", 00:11:45.177 "superblock": true, 00:11:45.177 "num_base_bdevs": 2, 00:11:45.177 "num_base_bdevs_discovered": 0, 00:11:45.177 "num_base_bdevs_operational": 2, 00:11:45.177 "base_bdevs_list": [ 00:11:45.177 { 00:11:45.177 "name": "BaseBdev1", 00:11:45.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.177 "is_configured": false, 00:11:45.177 "data_offset": 0, 00:11:45.177 "data_size": 0 00:11:45.177 }, 00:11:45.177 { 00:11:45.177 "name": "BaseBdev2", 00:11:45.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.177 "is_configured": false, 00:11:45.177 "data_offset": 0, 00:11:45.177 "data_size": 0 00:11:45.177 } 00:11:45.177 ] 00:11:45.177 }' 00:11:45.177 09:15:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.177 09:15:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:46.111 09:15:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:46.111 [2024-07-15 09:15:54.955239] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:46.111 [2024-07-15 09:15:54.955272] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2119a80 name Existed_Raid, state configuring 00:11:46.111 09:15:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:46.370 [2024-07-15 09:15:55.127716] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:46.370 [2024-07-15 09:15:55.127744] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:46.370 [2024-07-15 09:15:55.127754] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:46.370 [2024-07-15 09:15:55.127765] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:46.370 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:46.628 [2024-07-15 09:15:55.390312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:46.628 BaseBdev1 00:11:46.628 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:46.628 09:15:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:46.628 09:15:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:46.628 09:15:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:46.628 09:15:55 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:46.628 09:15:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:46.628 09:15:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:46.888 [ 00:11:46.888 { 00:11:46.888 "name": "BaseBdev1", 00:11:46.888 "aliases": [ 00:11:46.888 "21067eb0-d74e-45eb-8535-a386ae51ddd1" 00:11:46.888 ], 00:11:46.888 "product_name": "Malloc disk", 00:11:46.888 "block_size": 512, 00:11:46.888 "num_blocks": 65536, 00:11:46.888 "uuid": "21067eb0-d74e-45eb-8535-a386ae51ddd1", 00:11:46.888 "assigned_rate_limits": { 00:11:46.888 "rw_ios_per_sec": 0, 00:11:46.888 "rw_mbytes_per_sec": 0, 00:11:46.888 "r_mbytes_per_sec": 0, 00:11:46.888 "w_mbytes_per_sec": 0 00:11:46.888 }, 00:11:46.888 "claimed": true, 00:11:46.888 "claim_type": "exclusive_write", 00:11:46.888 "zoned": false, 00:11:46.888 "supported_io_types": { 00:11:46.888 "read": true, 00:11:46.888 "write": true, 00:11:46.888 "unmap": true, 00:11:46.888 "flush": true, 00:11:46.888 "reset": true, 00:11:46.888 "nvme_admin": false, 00:11:46.888 "nvme_io": false, 00:11:46.888 "nvme_io_md": false, 00:11:46.888 "write_zeroes": true, 00:11:46.888 "zcopy": true, 00:11:46.888 "get_zone_info": false, 00:11:46.888 "zone_management": false, 00:11:46.888 "zone_append": false, 00:11:46.888 "compare": false, 00:11:46.888 "compare_and_write": false, 00:11:46.888 "abort": true, 00:11:46.888 "seek_hole": false, 00:11:46.888 "seek_data": false, 00:11:46.888 "copy": true, 00:11:46.888 "nvme_iov_md": false 00:11:46.888 }, 00:11:46.888 "memory_domains": [ 00:11:46.888 { 00:11:46.888 "dma_device_id": "system", 00:11:46.888 "dma_device_type": 1 00:11:46.888 }, 00:11:46.888 { 00:11:46.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.888 "dma_device_type": 2 00:11:46.888 } 00:11:46.888 ], 00:11:46.888 "driver_specific": {} 00:11:46.888 } 00:11:46.888 ] 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:46.888 09:15:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:46.888 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.147 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.147 "name": "Existed_Raid", 00:11:47.147 "uuid": "e7157ae3-564f-4e95-8cd7-e3bbdf723ead", 00:11:47.147 "strip_size_kb": 64, 00:11:47.147 "state": "configuring", 00:11:47.147 "raid_level": "concat", 00:11:47.147 "superblock": true, 00:11:47.147 "num_base_bdevs": 2, 00:11:47.147 "num_base_bdevs_discovered": 1, 00:11:47.147 "num_base_bdevs_operational": 2, 00:11:47.147 "base_bdevs_list": [ 00:11:47.147 { 00:11:47.147 "name": "BaseBdev1", 00:11:47.147 "uuid": "21067eb0-d74e-45eb-8535-a386ae51ddd1", 00:11:47.147 "is_configured": true, 00:11:47.147 "data_offset": 2048, 00:11:47.147 "data_size": 63488 00:11:47.147 }, 00:11:47.147 { 00:11:47.147 "name": "BaseBdev2", 00:11:47.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.147 "is_configured": false, 00:11:47.147 "data_offset": 0, 00:11:47.147 "data_size": 0 00:11:47.147 } 00:11:47.147 ] 00:11:47.147 }' 00:11:47.147 09:15:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.147 09:15:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:47.714 09:15:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:47.973 [2024-07-15 09:15:56.765978] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:47.974 [2024-07-15 09:15:56.766020] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2119350 name Existed_Raid, state configuring 00:11:47.974 09:15:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:48.233 [2024-07-15 09:15:57.006654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:48.233 [2024-07-15 09:15:57.008193] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:48.233 [2024-07-15 09:15:57.008226] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.233 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.234 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.234 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:48.493 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.493 "name": "Existed_Raid", 00:11:48.493 "uuid": "9d464c19-b7dc-46c8-9fec-d33e42c5f194", 00:11:48.493 "strip_size_kb": 64, 00:11:48.493 "state": "configuring", 00:11:48.493 "raid_level": "concat", 00:11:48.493 "superblock": true, 00:11:48.493 "num_base_bdevs": 2, 00:11:48.493 "num_base_bdevs_discovered": 1, 00:11:48.493 "num_base_bdevs_operational": 2, 00:11:48.493 "base_bdevs_list": [ 00:11:48.493 { 00:11:48.493 "name": "BaseBdev1", 00:11:48.493 "uuid": "21067eb0-d74e-45eb-8535-a386ae51ddd1", 00:11:48.493 "is_configured": true, 00:11:48.493 "data_offset": 2048, 00:11:48.493 "data_size": 63488 00:11:48.493 }, 00:11:48.493 { 00:11:48.493 "name": "BaseBdev2", 00:11:48.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.493 "is_configured": false, 00:11:48.493 "data_offset": 0, 00:11:48.493 "data_size": 0 00:11:48.493 } 00:11:48.493 ] 00:11:48.493 }' 00:11:48.493 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.493 09:15:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.061 09:15:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:49.321 [2024-07-15 09:15:58.028648] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:49.321 [2024-07-15 09:15:58.028793] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x211a000 00:11:49.321 [2024-07-15 09:15:58.028806] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:49.321 [2024-07-15 09:15:58.028996] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20340c0 00:11:49.321 [2024-07-15 09:15:58.029113] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x211a000 00:11:49.321 [2024-07-15 09:15:58.029124] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x211a000 00:11:49.321 [2024-07-15 09:15:58.029217] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:49.321 BaseBdev2 00:11:49.321 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:49.321 09:15:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:49.321 09:15:58 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:49.321 09:15:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:49.321 09:15:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:49.321 09:15:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:49.321 09:15:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:49.580 09:15:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:49.580 [ 00:11:49.580 { 00:11:49.580 "name": "BaseBdev2", 00:11:49.580 "aliases": [ 00:11:49.580 "c140af3d-45c6-4389-b989-d99e276d2a16" 00:11:49.580 ], 00:11:49.580 "product_name": "Malloc disk", 00:11:49.580 "block_size": 512, 00:11:49.580 "num_blocks": 65536, 00:11:49.580 "uuid": "c140af3d-45c6-4389-b989-d99e276d2a16", 00:11:49.580 "assigned_rate_limits": { 00:11:49.580 "rw_ios_per_sec": 0, 00:11:49.580 "rw_mbytes_per_sec": 0, 00:11:49.580 "r_mbytes_per_sec": 0, 00:11:49.580 "w_mbytes_per_sec": 0 00:11:49.580 }, 00:11:49.580 "claimed": true, 00:11:49.580 "claim_type": "exclusive_write", 00:11:49.580 "zoned": false, 00:11:49.580 "supported_io_types": { 00:11:49.580 "read": true, 00:11:49.580 "write": true, 00:11:49.580 "unmap": true, 00:11:49.580 "flush": true, 00:11:49.580 "reset": true, 00:11:49.580 "nvme_admin": false, 00:11:49.580 "nvme_io": false, 00:11:49.580 "nvme_io_md": false, 00:11:49.580 "write_zeroes": true, 00:11:49.580 "zcopy": true, 00:11:49.580 "get_zone_info": false, 00:11:49.580 "zone_management": false, 00:11:49.580 "zone_append": false, 00:11:49.580 "compare": false, 00:11:49.580 "compare_and_write": false, 00:11:49.580 "abort": true, 00:11:49.580 "seek_hole": false, 00:11:49.580 "seek_data": false, 00:11:49.580 "copy": true, 00:11:49.580 "nvme_iov_md": false 00:11:49.580 }, 00:11:49.580 "memory_domains": [ 00:11:49.580 { 00:11:49.580 "dma_device_id": "system", 00:11:49.580 "dma_device_type": 1 00:11:49.580 }, 00:11:49.580 { 00:11:49.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.580 "dma_device_type": 2 00:11:49.580 } 00:11:49.580 ], 00:11:49.580 "driver_specific": {} 00:11:49.580 } 00:11:49.580 ] 00:11:49.838 09:15:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:49.838 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:49.838 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:49.838 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:49.838 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:49.839 
09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.839 "name": "Existed_Raid", 00:11:49.839 "uuid": "9d464c19-b7dc-46c8-9fec-d33e42c5f194", 00:11:49.839 "strip_size_kb": 64, 00:11:49.839 "state": "online", 00:11:49.839 "raid_level": "concat", 00:11:49.839 "superblock": true, 00:11:49.839 "num_base_bdevs": 2, 00:11:49.839 "num_base_bdevs_discovered": 2, 00:11:49.839 "num_base_bdevs_operational": 2, 00:11:49.839 "base_bdevs_list": [ 00:11:49.839 { 00:11:49.839 "name": "BaseBdev1", 00:11:49.839 "uuid": "21067eb0-d74e-45eb-8535-a386ae51ddd1", 00:11:49.839 "is_configured": true, 00:11:49.839 "data_offset": 2048, 00:11:49.839 "data_size": 63488 00:11:49.839 }, 00:11:49.839 { 00:11:49.839 "name": "BaseBdev2", 00:11:49.839 "uuid": "c140af3d-45c6-4389-b989-d99e276d2a16", 00:11:49.839 "is_configured": true, 00:11:49.839 "data_offset": 2048, 00:11:49.839 "data_size": 63488 00:11:49.839 } 00:11:49.839 ] 00:11:49.839 }' 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.839 09:15:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:50.775 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:50.775 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:50.775 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:50.775 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:50.776 [2024-07-15 09:15:59.621171] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:50.776 "name": "Existed_Raid", 00:11:50.776 "aliases": [ 00:11:50.776 "9d464c19-b7dc-46c8-9fec-d33e42c5f194" 00:11:50.776 ], 00:11:50.776 "product_name": "Raid Volume", 00:11:50.776 "block_size": 512, 00:11:50.776 "num_blocks": 126976, 00:11:50.776 "uuid": "9d464c19-b7dc-46c8-9fec-d33e42c5f194", 00:11:50.776 "assigned_rate_limits": { 00:11:50.776 
"rw_ios_per_sec": 0, 00:11:50.776 "rw_mbytes_per_sec": 0, 00:11:50.776 "r_mbytes_per_sec": 0, 00:11:50.776 "w_mbytes_per_sec": 0 00:11:50.776 }, 00:11:50.776 "claimed": false, 00:11:50.776 "zoned": false, 00:11:50.776 "supported_io_types": { 00:11:50.776 "read": true, 00:11:50.776 "write": true, 00:11:50.776 "unmap": true, 00:11:50.776 "flush": true, 00:11:50.776 "reset": true, 00:11:50.776 "nvme_admin": false, 00:11:50.776 "nvme_io": false, 00:11:50.776 "nvme_io_md": false, 00:11:50.776 "write_zeroes": true, 00:11:50.776 "zcopy": false, 00:11:50.776 "get_zone_info": false, 00:11:50.776 "zone_management": false, 00:11:50.776 "zone_append": false, 00:11:50.776 "compare": false, 00:11:50.776 "compare_and_write": false, 00:11:50.776 "abort": false, 00:11:50.776 "seek_hole": false, 00:11:50.776 "seek_data": false, 00:11:50.776 "copy": false, 00:11:50.776 "nvme_iov_md": false 00:11:50.776 }, 00:11:50.776 "memory_domains": [ 00:11:50.776 { 00:11:50.776 "dma_device_id": "system", 00:11:50.776 "dma_device_type": 1 00:11:50.776 }, 00:11:50.776 { 00:11:50.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.776 "dma_device_type": 2 00:11:50.776 }, 00:11:50.776 { 00:11:50.776 "dma_device_id": "system", 00:11:50.776 "dma_device_type": 1 00:11:50.776 }, 00:11:50.776 { 00:11:50.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.776 "dma_device_type": 2 00:11:50.776 } 00:11:50.776 ], 00:11:50.776 "driver_specific": { 00:11:50.776 "raid": { 00:11:50.776 "uuid": "9d464c19-b7dc-46c8-9fec-d33e42c5f194", 00:11:50.776 "strip_size_kb": 64, 00:11:50.776 "state": "online", 00:11:50.776 "raid_level": "concat", 00:11:50.776 "superblock": true, 00:11:50.776 "num_base_bdevs": 2, 00:11:50.776 "num_base_bdevs_discovered": 2, 00:11:50.776 "num_base_bdevs_operational": 2, 00:11:50.776 "base_bdevs_list": [ 00:11:50.776 { 00:11:50.776 "name": "BaseBdev1", 00:11:50.776 "uuid": "21067eb0-d74e-45eb-8535-a386ae51ddd1", 00:11:50.776 "is_configured": true, 00:11:50.776 "data_offset": 2048, 00:11:50.776 "data_size": 63488 00:11:50.776 }, 00:11:50.776 { 00:11:50.776 "name": "BaseBdev2", 00:11:50.776 "uuid": "c140af3d-45c6-4389-b989-d99e276d2a16", 00:11:50.776 "is_configured": true, 00:11:50.776 "data_offset": 2048, 00:11:50.776 "data_size": 63488 00:11:50.776 } 00:11:50.776 ] 00:11:50.776 } 00:11:50.776 } 00:11:50.776 }' 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:50.776 BaseBdev2' 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:50.776 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:51.035 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:51.035 "name": "BaseBdev1", 00:11:51.035 "aliases": [ 00:11:51.035 "21067eb0-d74e-45eb-8535-a386ae51ddd1" 00:11:51.035 ], 00:11:51.035 "product_name": "Malloc disk", 00:11:51.035 "block_size": 512, 00:11:51.035 "num_blocks": 65536, 00:11:51.035 "uuid": "21067eb0-d74e-45eb-8535-a386ae51ddd1", 00:11:51.035 "assigned_rate_limits": { 00:11:51.035 "rw_ios_per_sec": 0, 
00:11:51.035 "rw_mbytes_per_sec": 0, 00:11:51.035 "r_mbytes_per_sec": 0, 00:11:51.035 "w_mbytes_per_sec": 0 00:11:51.035 }, 00:11:51.035 "claimed": true, 00:11:51.035 "claim_type": "exclusive_write", 00:11:51.035 "zoned": false, 00:11:51.035 "supported_io_types": { 00:11:51.035 "read": true, 00:11:51.035 "write": true, 00:11:51.035 "unmap": true, 00:11:51.035 "flush": true, 00:11:51.035 "reset": true, 00:11:51.035 "nvme_admin": false, 00:11:51.035 "nvme_io": false, 00:11:51.035 "nvme_io_md": false, 00:11:51.035 "write_zeroes": true, 00:11:51.035 "zcopy": true, 00:11:51.035 "get_zone_info": false, 00:11:51.035 "zone_management": false, 00:11:51.035 "zone_append": false, 00:11:51.035 "compare": false, 00:11:51.035 "compare_and_write": false, 00:11:51.035 "abort": true, 00:11:51.035 "seek_hole": false, 00:11:51.035 "seek_data": false, 00:11:51.035 "copy": true, 00:11:51.035 "nvme_iov_md": false 00:11:51.035 }, 00:11:51.035 "memory_domains": [ 00:11:51.035 { 00:11:51.035 "dma_device_id": "system", 00:11:51.035 "dma_device_type": 1 00:11:51.035 }, 00:11:51.035 { 00:11:51.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.035 "dma_device_type": 2 00:11:51.035 } 00:11:51.035 ], 00:11:51.035 "driver_specific": {} 00:11:51.035 }' 00:11:51.035 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.035 09:15:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.295 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:51.295 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.295 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.295 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:51.295 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.295 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.295 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:51.295 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.553 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.553 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:51.553 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.553 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:51.553 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:51.812 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:51.812 "name": "BaseBdev2", 00:11:51.812 "aliases": [ 00:11:51.812 "c140af3d-45c6-4389-b989-d99e276d2a16" 00:11:51.812 ], 00:11:51.812 "product_name": "Malloc disk", 00:11:51.812 "block_size": 512, 00:11:51.812 "num_blocks": 65536, 00:11:51.812 "uuid": "c140af3d-45c6-4389-b989-d99e276d2a16", 00:11:51.812 "assigned_rate_limits": { 00:11:51.812 "rw_ios_per_sec": 0, 00:11:51.812 "rw_mbytes_per_sec": 0, 00:11:51.812 "r_mbytes_per_sec": 0, 00:11:51.812 "w_mbytes_per_sec": 0 00:11:51.812 }, 
00:11:51.812 "claimed": true, 00:11:51.812 "claim_type": "exclusive_write", 00:11:51.812 "zoned": false, 00:11:51.812 "supported_io_types": { 00:11:51.812 "read": true, 00:11:51.812 "write": true, 00:11:51.812 "unmap": true, 00:11:51.812 "flush": true, 00:11:51.812 "reset": true, 00:11:51.812 "nvme_admin": false, 00:11:51.812 "nvme_io": false, 00:11:51.812 "nvme_io_md": false, 00:11:51.812 "write_zeroes": true, 00:11:51.812 "zcopy": true, 00:11:51.812 "get_zone_info": false, 00:11:51.812 "zone_management": false, 00:11:51.812 "zone_append": false, 00:11:51.812 "compare": false, 00:11:51.812 "compare_and_write": false, 00:11:51.812 "abort": true, 00:11:51.812 "seek_hole": false, 00:11:51.812 "seek_data": false, 00:11:51.812 "copy": true, 00:11:51.812 "nvme_iov_md": false 00:11:51.812 }, 00:11:51.812 "memory_domains": [ 00:11:51.812 { 00:11:51.812 "dma_device_id": "system", 00:11:51.812 "dma_device_type": 1 00:11:51.812 }, 00:11:51.812 { 00:11:51.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.812 "dma_device_type": 2 00:11:51.812 } 00:11:51.813 ], 00:11:51.813 "driver_specific": {} 00:11:51.813 }' 00:11:51.813 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.813 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.813 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:51.813 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.813 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.813 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:51.813 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.813 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.072 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.072 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.072 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.072 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:52.072 09:16:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:52.330 [2024-07-15 09:16:01.120938] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:52.330 [2024-07-15 09:16:01.120967] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:52.330 [2024-07-15 09:16:01.121008] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.330 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:52.588 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:52.588 "name": "Existed_Raid", 00:11:52.588 "uuid": "9d464c19-b7dc-46c8-9fec-d33e42c5f194", 00:11:52.588 "strip_size_kb": 64, 00:11:52.588 "state": "offline", 00:11:52.588 "raid_level": "concat", 00:11:52.588 "superblock": true, 00:11:52.588 "num_base_bdevs": 2, 00:11:52.588 "num_base_bdevs_discovered": 1, 00:11:52.588 "num_base_bdevs_operational": 1, 00:11:52.588 "base_bdevs_list": [ 00:11:52.588 { 00:11:52.588 "name": null, 00:11:52.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.588 "is_configured": false, 00:11:52.588 "data_offset": 2048, 00:11:52.588 "data_size": 63488 00:11:52.588 }, 00:11:52.588 { 00:11:52.588 "name": "BaseBdev2", 00:11:52.588 "uuid": "c140af3d-45c6-4389-b989-d99e276d2a16", 00:11:52.588 "is_configured": true, 00:11:52.588 "data_offset": 2048, 00:11:52.588 "data_size": 63488 00:11:52.588 } 00:11:52.588 ] 00:11:52.588 }' 00:11:52.588 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:52.588 09:16:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:53.153 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:53.153 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:53.153 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.153 09:16:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:53.411 09:16:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:53.411 09:16:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:53.411 09:16:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:53.977 [2024-07-15 09:16:02.714174] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:53.977 [2024-07-15 09:16:02.714222] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x211a000 name Existed_Raid, state offline 00:11:53.977 09:16:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:53.977 09:16:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:53.978 09:16:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.978 09:16:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 91820 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 91820 ']' 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 91820 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 91820 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 91820' 00:11:54.237 killing process with pid 91820 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 91820 00:11:54.237 [2024-07-15 09:16:03.050059] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:54.237 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 91820 00:11:54.237 [2024-07-15 09:16:03.050997] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:54.496 09:16:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:54.496 00:11:54.496 real 0m10.806s 00:11:54.496 user 0m19.245s 00:11:54.496 sys 0m1.957s 00:11:54.496 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:54.496 09:16:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.496 ************************************ 00:11:54.496 END TEST raid_state_function_test_sb 00:11:54.496 ************************************ 00:11:54.496 09:16:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:54.496 09:16:03 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:54.496 09:16:03 
bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:54.496 09:16:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:54.496 09:16:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:54.496 ************************************ 00:11:54.496 START TEST raid_superblock_test 00:11:54.496 ************************************ 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=93456 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 93456 /var/tmp/spdk-raid.sock 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:54.496 09:16:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 93456 ']' 00:11:54.497 09:16:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:54.497 09:16:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:54.497 09:16:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:54.497 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
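For orientation, the raid_superblock_test trace that follows assembles its array from passthru bdevs (pt1, pt2) layered on malloc bdevs, then creates a concat raid with an on-disk superblock (-s). The sketch below condenses that setup using only the RPCs visible in the trace; the rpc() shorthand and the assumption that bdev_svc is serving /var/tmp/spdk-raid.sock are the same as in the earlier sketch and are not part of the recorded output.

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }   # shorthand for this sketch only

# Base devices: each malloc bdev is wrapped by a passthru bdev with a fixed UUID.
rpc bdev_malloc_create 32 512 -b malloc1
rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
rpc bdev_malloc_create 32 512 -b malloc2
rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

# Create the concat raid over the passthru bdevs, writing a superblock (-s).
rpc bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s

# Verify the array came up online with both base bdevs configured.
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'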
00:11:54.497 09:16:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:54.497 09:16:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.497 [2024-07-15 09:16:03.425021] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:11:54.497 [2024-07-15 09:16:03.425085] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid93456 ] 00:11:54.755 [2024-07-15 09:16:03.552712] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:54.755 [2024-07-15 09:16:03.658587] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:55.011 [2024-07-15 09:16:03.729803] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:55.011 [2024-07-15 09:16:03.729841] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:55.580 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:55.837 malloc1 00:11:55.837 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:56.094 [2024-07-15 09:16:04.844617] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:56.094 [2024-07-15 09:16:04.844665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:56.094 [2024-07-15 09:16:04.844685] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x838570 00:11:56.094 [2024-07-15 09:16:04.844698] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:56.094 [2024-07-15 09:16:04.846434] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:56.094 [2024-07-15 09:16:04.846462] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:56.094 pt1 00:11:56.094 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:56.094 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:11:56.094 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:56.094 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:56.094 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:56.094 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:56.094 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:56.094 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:56.094 09:16:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:56.351 malloc2 00:11:56.351 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:56.608 [2024-07-15 09:16:05.347945] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:56.608 [2024-07-15 09:16:05.347991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:56.608 [2024-07-15 09:16:05.348009] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x839970 00:11:56.608 [2024-07-15 09:16:05.348022] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:56.608 [2024-07-15 09:16:05.349662] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:56.608 [2024-07-15 09:16:05.349690] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:56.608 pt2 00:11:56.608 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:56.608 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:56.608 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:56.865 [2024-07-15 09:16:05.588601] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:56.865 [2024-07-15 09:16:05.589975] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:56.865 [2024-07-15 09:16:05.590122] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9dc270 00:11:56.865 [2024-07-15 09:16:05.590136] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:56.865 [2024-07-15 09:16:05.590330] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9d1c10 00:11:56.865 [2024-07-15 09:16:05.590477] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9dc270 00:11:56.866 [2024-07-15 09:16:05.590487] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9dc270 00:11:56.866 [2024-07-15 09:16:05.590587] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:56.866 09:16:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.866 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:57.122 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.122 "name": "raid_bdev1", 00:11:57.122 "uuid": "d9352c5c-5817-4216-8c08-214c2fb1b6e4", 00:11:57.123 "strip_size_kb": 64, 00:11:57.123 "state": "online", 00:11:57.123 "raid_level": "concat", 00:11:57.123 "superblock": true, 00:11:57.123 "num_base_bdevs": 2, 00:11:57.123 "num_base_bdevs_discovered": 2, 00:11:57.123 "num_base_bdevs_operational": 2, 00:11:57.123 "base_bdevs_list": [ 00:11:57.123 { 00:11:57.123 "name": "pt1", 00:11:57.123 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:57.123 "is_configured": true, 00:11:57.123 "data_offset": 2048, 00:11:57.123 "data_size": 63488 00:11:57.123 }, 00:11:57.123 { 00:11:57.123 "name": "pt2", 00:11:57.123 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:57.123 "is_configured": true, 00:11:57.123 "data_offset": 2048, 00:11:57.123 "data_size": 63488 00:11:57.123 } 00:11:57.123 ] 00:11:57.123 }' 00:11:57.123 09:16:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.123 09:16:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.687 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:57.687 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:57.687 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:57.687 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:57.687 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:57.687 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:57.687 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:57.687 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:57.944 [2024-07-15 09:16:06.683735] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:57.944 09:16:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:57.944 "name": "raid_bdev1", 00:11:57.944 "aliases": [ 00:11:57.945 "d9352c5c-5817-4216-8c08-214c2fb1b6e4" 00:11:57.945 ], 00:11:57.945 "product_name": "Raid Volume", 00:11:57.945 "block_size": 512, 00:11:57.945 "num_blocks": 126976, 00:11:57.945 "uuid": "d9352c5c-5817-4216-8c08-214c2fb1b6e4", 00:11:57.945 "assigned_rate_limits": { 00:11:57.945 "rw_ios_per_sec": 0, 00:11:57.945 "rw_mbytes_per_sec": 0, 00:11:57.945 "r_mbytes_per_sec": 0, 00:11:57.945 "w_mbytes_per_sec": 0 00:11:57.945 }, 00:11:57.945 "claimed": false, 00:11:57.945 "zoned": false, 00:11:57.945 "supported_io_types": { 00:11:57.945 "read": true, 00:11:57.945 "write": true, 00:11:57.945 "unmap": true, 00:11:57.945 "flush": true, 00:11:57.945 "reset": true, 00:11:57.945 "nvme_admin": false, 00:11:57.945 "nvme_io": false, 00:11:57.945 "nvme_io_md": false, 00:11:57.945 "write_zeroes": true, 00:11:57.945 "zcopy": false, 00:11:57.945 "get_zone_info": false, 00:11:57.945 "zone_management": false, 00:11:57.945 "zone_append": false, 00:11:57.945 "compare": false, 00:11:57.945 "compare_and_write": false, 00:11:57.945 "abort": false, 00:11:57.945 "seek_hole": false, 00:11:57.945 "seek_data": false, 00:11:57.945 "copy": false, 00:11:57.945 "nvme_iov_md": false 00:11:57.945 }, 00:11:57.945 "memory_domains": [ 00:11:57.945 { 00:11:57.945 "dma_device_id": "system", 00:11:57.945 "dma_device_type": 1 00:11:57.945 }, 00:11:57.945 { 00:11:57.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.945 "dma_device_type": 2 00:11:57.945 }, 00:11:57.945 { 00:11:57.945 "dma_device_id": "system", 00:11:57.945 "dma_device_type": 1 00:11:57.945 }, 00:11:57.945 { 00:11:57.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.945 "dma_device_type": 2 00:11:57.945 } 00:11:57.945 ], 00:11:57.945 "driver_specific": { 00:11:57.945 "raid": { 00:11:57.945 "uuid": "d9352c5c-5817-4216-8c08-214c2fb1b6e4", 00:11:57.945 "strip_size_kb": 64, 00:11:57.945 "state": "online", 00:11:57.945 "raid_level": "concat", 00:11:57.945 "superblock": true, 00:11:57.945 "num_base_bdevs": 2, 00:11:57.945 "num_base_bdevs_discovered": 2, 00:11:57.945 "num_base_bdevs_operational": 2, 00:11:57.945 "base_bdevs_list": [ 00:11:57.945 { 00:11:57.945 "name": "pt1", 00:11:57.945 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:57.945 "is_configured": true, 00:11:57.945 "data_offset": 2048, 00:11:57.945 "data_size": 63488 00:11:57.945 }, 00:11:57.945 { 00:11:57.945 "name": "pt2", 00:11:57.945 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:57.945 "is_configured": true, 00:11:57.945 "data_offset": 2048, 00:11:57.945 "data_size": 63488 00:11:57.945 } 00:11:57.945 ] 00:11:57.945 } 00:11:57.945 } 00:11:57.945 }' 00:11:57.945 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:57.945 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:57.945 pt2' 00:11:57.945 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:57.945 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:57.945 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:58.203 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:58.203 "name": "pt1", 00:11:58.203 
"aliases": [ 00:11:58.203 "00000000-0000-0000-0000-000000000001" 00:11:58.203 ], 00:11:58.203 "product_name": "passthru", 00:11:58.203 "block_size": 512, 00:11:58.203 "num_blocks": 65536, 00:11:58.203 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:58.203 "assigned_rate_limits": { 00:11:58.203 "rw_ios_per_sec": 0, 00:11:58.203 "rw_mbytes_per_sec": 0, 00:11:58.203 "r_mbytes_per_sec": 0, 00:11:58.203 "w_mbytes_per_sec": 0 00:11:58.203 }, 00:11:58.203 "claimed": true, 00:11:58.203 "claim_type": "exclusive_write", 00:11:58.203 "zoned": false, 00:11:58.203 "supported_io_types": { 00:11:58.203 "read": true, 00:11:58.203 "write": true, 00:11:58.203 "unmap": true, 00:11:58.203 "flush": true, 00:11:58.203 "reset": true, 00:11:58.203 "nvme_admin": false, 00:11:58.203 "nvme_io": false, 00:11:58.203 "nvme_io_md": false, 00:11:58.203 "write_zeroes": true, 00:11:58.203 "zcopy": true, 00:11:58.203 "get_zone_info": false, 00:11:58.203 "zone_management": false, 00:11:58.203 "zone_append": false, 00:11:58.203 "compare": false, 00:11:58.203 "compare_and_write": false, 00:11:58.203 "abort": true, 00:11:58.203 "seek_hole": false, 00:11:58.203 "seek_data": false, 00:11:58.203 "copy": true, 00:11:58.203 "nvme_iov_md": false 00:11:58.203 }, 00:11:58.203 "memory_domains": [ 00:11:58.203 { 00:11:58.203 "dma_device_id": "system", 00:11:58.203 "dma_device_type": 1 00:11:58.203 }, 00:11:58.203 { 00:11:58.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.203 "dma_device_type": 2 00:11:58.203 } 00:11:58.203 ], 00:11:58.203 "driver_specific": { 00:11:58.203 "passthru": { 00:11:58.203 "name": "pt1", 00:11:58.203 "base_bdev_name": "malloc1" 00:11:58.203 } 00:11:58.203 } 00:11:58.203 }' 00:11:58.203 09:16:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.203 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.203 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:58.203 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.203 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:58.461 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:58.719 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:58.719 "name": "pt2", 00:11:58.719 "aliases": [ 00:11:58.719 "00000000-0000-0000-0000-000000000002" 00:11:58.719 ], 00:11:58.719 "product_name": "passthru", 00:11:58.719 
"block_size": 512, 00:11:58.719 "num_blocks": 65536, 00:11:58.719 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:58.719 "assigned_rate_limits": { 00:11:58.719 "rw_ios_per_sec": 0, 00:11:58.719 "rw_mbytes_per_sec": 0, 00:11:58.719 "r_mbytes_per_sec": 0, 00:11:58.719 "w_mbytes_per_sec": 0 00:11:58.719 }, 00:11:58.719 "claimed": true, 00:11:58.719 "claim_type": "exclusive_write", 00:11:58.719 "zoned": false, 00:11:58.719 "supported_io_types": { 00:11:58.719 "read": true, 00:11:58.719 "write": true, 00:11:58.719 "unmap": true, 00:11:58.719 "flush": true, 00:11:58.719 "reset": true, 00:11:58.719 "nvme_admin": false, 00:11:58.719 "nvme_io": false, 00:11:58.719 "nvme_io_md": false, 00:11:58.719 "write_zeroes": true, 00:11:58.719 "zcopy": true, 00:11:58.719 "get_zone_info": false, 00:11:58.719 "zone_management": false, 00:11:58.719 "zone_append": false, 00:11:58.719 "compare": false, 00:11:58.719 "compare_and_write": false, 00:11:58.719 "abort": true, 00:11:58.719 "seek_hole": false, 00:11:58.719 "seek_data": false, 00:11:58.719 "copy": true, 00:11:58.719 "nvme_iov_md": false 00:11:58.719 }, 00:11:58.719 "memory_domains": [ 00:11:58.719 { 00:11:58.719 "dma_device_id": "system", 00:11:58.719 "dma_device_type": 1 00:11:58.719 }, 00:11:58.719 { 00:11:58.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.719 "dma_device_type": 2 00:11:58.719 } 00:11:58.719 ], 00:11:58.719 "driver_specific": { 00:11:58.719 "passthru": { 00:11:58.719 "name": "pt2", 00:11:58.719 "base_bdev_name": "malloc2" 00:11:58.719 } 00:11:58.719 } 00:11:58.719 }' 00:11:58.719 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.719 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.977 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:58.977 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.977 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.977 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:58.977 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.977 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.977 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:58.977 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.977 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.254 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:59.254 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:59.254 09:16:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:59.254 [2024-07-15 09:16:08.167619] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:59.254 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d9352c5c-5817-4216-8c08-214c2fb1b6e4 00:11:59.254 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d9352c5c-5817-4216-8c08-214c2fb1b6e4 ']' 00:11:59.254 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:59.511 [2024-07-15 09:16:08.416044] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:59.511 [2024-07-15 09:16:08.416066] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:59.511 [2024-07-15 09:16:08.416122] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:59.511 [2024-07-15 09:16:08.416168] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:59.511 [2024-07-15 09:16:08.416181] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9dc270 name raid_bdev1, state offline 00:11:59.511 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.511 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:59.768 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:59.768 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:59.768 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:59.768 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:00.026 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:00.026 09:16:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:00.284 09:16:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:00.284 09:16:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:00.542 09:16:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:00.542 09:16:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:00.542 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:00.542 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:00.543 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.543 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:00.543 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.543 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:00.543 09:16:09 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.543 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:00.543 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.543 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:00.543 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:00.801 [2024-07-15 09:16:09.631224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:00.801 [2024-07-15 09:16:09.632610] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:00.801 [2024-07-15 09:16:09.632666] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:00.801 [2024-07-15 09:16:09.632709] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:00.801 [2024-07-15 09:16:09.632729] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:00.801 [2024-07-15 09:16:09.632742] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9dbff0 name raid_bdev1, state configuring 00:12:00.801 request: 00:12:00.801 { 00:12:00.801 "name": "raid_bdev1", 00:12:00.801 "raid_level": "concat", 00:12:00.801 "base_bdevs": [ 00:12:00.801 "malloc1", 00:12:00.801 "malloc2" 00:12:00.801 ], 00:12:00.801 "strip_size_kb": 64, 00:12:00.801 "superblock": false, 00:12:00.801 "method": "bdev_raid_create", 00:12:00.801 "req_id": 1 00:12:00.801 } 00:12:00.801 Got JSON-RPC error response 00:12:00.801 response: 00:12:00.801 { 00:12:00.801 "code": -17, 00:12:00.801 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:00.801 } 00:12:00.801 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:00.801 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:00.801 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:00.801 09:16:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:00.801 09:16:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.801 09:16:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:01.059 09:16:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:01.059 09:16:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:01.059 09:16:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:01.318 [2024-07-15 09:16:10.112449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:01.318 [2024-07-15 09:16:10.112500] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:12:01.318 [2024-07-15 09:16:10.112521] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8387a0 00:12:01.318 [2024-07-15 09:16:10.112534] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:01.318 [2024-07-15 09:16:10.114126] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:01.318 [2024-07-15 09:16:10.114154] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:01.318 [2024-07-15 09:16:10.114223] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:01.318 [2024-07-15 09:16:10.114248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:01.318 pt1 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.318 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:01.577 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:01.577 "name": "raid_bdev1", 00:12:01.577 "uuid": "d9352c5c-5817-4216-8c08-214c2fb1b6e4", 00:12:01.577 "strip_size_kb": 64, 00:12:01.577 "state": "configuring", 00:12:01.577 "raid_level": "concat", 00:12:01.577 "superblock": true, 00:12:01.577 "num_base_bdevs": 2, 00:12:01.577 "num_base_bdevs_discovered": 1, 00:12:01.577 "num_base_bdevs_operational": 2, 00:12:01.577 "base_bdevs_list": [ 00:12:01.577 { 00:12:01.577 "name": "pt1", 00:12:01.577 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:01.577 "is_configured": true, 00:12:01.577 "data_offset": 2048, 00:12:01.577 "data_size": 63488 00:12:01.577 }, 00:12:01.577 { 00:12:01.577 "name": null, 00:12:01.577 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:01.577 "is_configured": false, 00:12:01.577 "data_offset": 2048, 00:12:01.577 "data_size": 63488 00:12:01.577 } 00:12:01.577 ] 00:12:01.577 }' 00:12:01.577 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:01.577 09:16:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.185 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:02.185 09:16:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:02.185 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:02.185 09:16:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:02.444 [2024-07-15 09:16:11.203329] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:02.444 [2024-07-15 09:16:11.203377] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:02.444 [2024-07-15 09:16:11.203396] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9d2820 00:12:02.444 [2024-07-15 09:16:11.203408] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:02.444 [2024-07-15 09:16:11.203757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:02.444 [2024-07-15 09:16:11.203775] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:02.444 [2024-07-15 09:16:11.203842] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:02.444 [2024-07-15 09:16:11.203861] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:02.444 [2024-07-15 09:16:11.203964] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x82eec0 00:12:02.444 [2024-07-15 09:16:11.203975] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:02.444 [2024-07-15 09:16:11.204145] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x82ff00 00:12:02.444 [2024-07-15 09:16:11.204272] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x82eec0 00:12:02.444 [2024-07-15 09:16:11.204282] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x82eec0 00:12:02.444 [2024-07-15 09:16:11.204381] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:02.444 pt2 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.444 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:02.703 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.703 "name": "raid_bdev1", 00:12:02.703 "uuid": "d9352c5c-5817-4216-8c08-214c2fb1b6e4", 00:12:02.703 "strip_size_kb": 64, 00:12:02.703 "state": "online", 00:12:02.703 "raid_level": "concat", 00:12:02.703 "superblock": true, 00:12:02.703 "num_base_bdevs": 2, 00:12:02.703 "num_base_bdevs_discovered": 2, 00:12:02.703 "num_base_bdevs_operational": 2, 00:12:02.703 "base_bdevs_list": [ 00:12:02.703 { 00:12:02.703 "name": "pt1", 00:12:02.703 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:02.703 "is_configured": true, 00:12:02.703 "data_offset": 2048, 00:12:02.703 "data_size": 63488 00:12:02.703 }, 00:12:02.703 { 00:12:02.703 "name": "pt2", 00:12:02.703 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:02.703 "is_configured": true, 00:12:02.703 "data_offset": 2048, 00:12:02.703 "data_size": 63488 00:12:02.703 } 00:12:02.703 ] 00:12:02.703 }' 00:12:02.703 09:16:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.703 09:16:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.269 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:03.269 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:03.269 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:03.269 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:03.269 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:03.269 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:03.269 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:03.269 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:03.528 [2024-07-15 09:16:12.282448] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:03.528 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:03.528 "name": "raid_bdev1", 00:12:03.528 "aliases": [ 00:12:03.528 "d9352c5c-5817-4216-8c08-214c2fb1b6e4" 00:12:03.528 ], 00:12:03.528 "product_name": "Raid Volume", 00:12:03.528 "block_size": 512, 00:12:03.528 "num_blocks": 126976, 00:12:03.528 "uuid": "d9352c5c-5817-4216-8c08-214c2fb1b6e4", 00:12:03.528 "assigned_rate_limits": { 00:12:03.528 "rw_ios_per_sec": 0, 00:12:03.528 "rw_mbytes_per_sec": 0, 00:12:03.528 "r_mbytes_per_sec": 0, 00:12:03.528 "w_mbytes_per_sec": 0 00:12:03.528 }, 00:12:03.528 "claimed": false, 00:12:03.528 "zoned": false, 00:12:03.528 "supported_io_types": { 00:12:03.528 "read": true, 00:12:03.528 "write": true, 00:12:03.528 "unmap": true, 00:12:03.528 "flush": true, 00:12:03.528 "reset": true, 00:12:03.528 "nvme_admin": false, 00:12:03.528 "nvme_io": false, 00:12:03.528 "nvme_io_md": false, 00:12:03.528 "write_zeroes": true, 00:12:03.528 "zcopy": false, 00:12:03.528 "get_zone_info": false, 00:12:03.528 "zone_management": false, 00:12:03.528 
"zone_append": false, 00:12:03.528 "compare": false, 00:12:03.528 "compare_and_write": false, 00:12:03.528 "abort": false, 00:12:03.528 "seek_hole": false, 00:12:03.528 "seek_data": false, 00:12:03.528 "copy": false, 00:12:03.528 "nvme_iov_md": false 00:12:03.528 }, 00:12:03.528 "memory_domains": [ 00:12:03.528 { 00:12:03.528 "dma_device_id": "system", 00:12:03.528 "dma_device_type": 1 00:12:03.528 }, 00:12:03.528 { 00:12:03.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.528 "dma_device_type": 2 00:12:03.528 }, 00:12:03.528 { 00:12:03.528 "dma_device_id": "system", 00:12:03.528 "dma_device_type": 1 00:12:03.528 }, 00:12:03.528 { 00:12:03.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.528 "dma_device_type": 2 00:12:03.528 } 00:12:03.528 ], 00:12:03.528 "driver_specific": { 00:12:03.528 "raid": { 00:12:03.528 "uuid": "d9352c5c-5817-4216-8c08-214c2fb1b6e4", 00:12:03.528 "strip_size_kb": 64, 00:12:03.528 "state": "online", 00:12:03.528 "raid_level": "concat", 00:12:03.528 "superblock": true, 00:12:03.528 "num_base_bdevs": 2, 00:12:03.528 "num_base_bdevs_discovered": 2, 00:12:03.528 "num_base_bdevs_operational": 2, 00:12:03.528 "base_bdevs_list": [ 00:12:03.528 { 00:12:03.528 "name": "pt1", 00:12:03.528 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:03.528 "is_configured": true, 00:12:03.528 "data_offset": 2048, 00:12:03.528 "data_size": 63488 00:12:03.528 }, 00:12:03.528 { 00:12:03.528 "name": "pt2", 00:12:03.528 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:03.528 "is_configured": true, 00:12:03.528 "data_offset": 2048, 00:12:03.528 "data_size": 63488 00:12:03.528 } 00:12:03.528 ] 00:12:03.528 } 00:12:03.528 } 00:12:03.528 }' 00:12:03.528 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:03.528 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:03.528 pt2' 00:12:03.528 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.528 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:03.528 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.787 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.787 "name": "pt1", 00:12:03.787 "aliases": [ 00:12:03.787 "00000000-0000-0000-0000-000000000001" 00:12:03.787 ], 00:12:03.787 "product_name": "passthru", 00:12:03.787 "block_size": 512, 00:12:03.787 "num_blocks": 65536, 00:12:03.787 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:03.787 "assigned_rate_limits": { 00:12:03.787 "rw_ios_per_sec": 0, 00:12:03.787 "rw_mbytes_per_sec": 0, 00:12:03.787 "r_mbytes_per_sec": 0, 00:12:03.787 "w_mbytes_per_sec": 0 00:12:03.787 }, 00:12:03.787 "claimed": true, 00:12:03.787 "claim_type": "exclusive_write", 00:12:03.788 "zoned": false, 00:12:03.788 "supported_io_types": { 00:12:03.788 "read": true, 00:12:03.788 "write": true, 00:12:03.788 "unmap": true, 00:12:03.788 "flush": true, 00:12:03.788 "reset": true, 00:12:03.788 "nvme_admin": false, 00:12:03.788 "nvme_io": false, 00:12:03.788 "nvme_io_md": false, 00:12:03.788 "write_zeroes": true, 00:12:03.788 "zcopy": true, 00:12:03.788 "get_zone_info": false, 00:12:03.788 "zone_management": false, 00:12:03.788 "zone_append": false, 00:12:03.788 "compare": false, 
00:12:03.788 "compare_and_write": false, 00:12:03.788 "abort": true, 00:12:03.788 "seek_hole": false, 00:12:03.788 "seek_data": false, 00:12:03.788 "copy": true, 00:12:03.788 "nvme_iov_md": false 00:12:03.788 }, 00:12:03.788 "memory_domains": [ 00:12:03.788 { 00:12:03.788 "dma_device_id": "system", 00:12:03.788 "dma_device_type": 1 00:12:03.788 }, 00:12:03.788 { 00:12:03.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.788 "dma_device_type": 2 00:12:03.788 } 00:12:03.788 ], 00:12:03.788 "driver_specific": { 00:12:03.788 "passthru": { 00:12:03.788 "name": "pt1", 00:12:03.788 "base_bdev_name": "malloc1" 00:12:03.788 } 00:12:03.788 } 00:12:03.788 }' 00:12:03.788 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.788 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.788 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.788 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.788 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:04.047 09:16:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.306 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.306 "name": "pt2", 00:12:04.306 "aliases": [ 00:12:04.306 "00000000-0000-0000-0000-000000000002" 00:12:04.306 ], 00:12:04.306 "product_name": "passthru", 00:12:04.306 "block_size": 512, 00:12:04.306 "num_blocks": 65536, 00:12:04.306 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:04.306 "assigned_rate_limits": { 00:12:04.306 "rw_ios_per_sec": 0, 00:12:04.306 "rw_mbytes_per_sec": 0, 00:12:04.306 "r_mbytes_per_sec": 0, 00:12:04.306 "w_mbytes_per_sec": 0 00:12:04.306 }, 00:12:04.306 "claimed": true, 00:12:04.306 "claim_type": "exclusive_write", 00:12:04.306 "zoned": false, 00:12:04.306 "supported_io_types": { 00:12:04.306 "read": true, 00:12:04.306 "write": true, 00:12:04.306 "unmap": true, 00:12:04.306 "flush": true, 00:12:04.306 "reset": true, 00:12:04.306 "nvme_admin": false, 00:12:04.306 "nvme_io": false, 00:12:04.306 "nvme_io_md": false, 00:12:04.306 "write_zeroes": true, 00:12:04.306 "zcopy": true, 00:12:04.306 "get_zone_info": false, 00:12:04.306 "zone_management": false, 00:12:04.306 "zone_append": false, 00:12:04.306 "compare": false, 00:12:04.306 "compare_and_write": false, 00:12:04.306 "abort": true, 00:12:04.306 "seek_hole": false, 00:12:04.306 "seek_data": false, 
00:12:04.306 "copy": true, 00:12:04.306 "nvme_iov_md": false 00:12:04.306 }, 00:12:04.306 "memory_domains": [ 00:12:04.306 { 00:12:04.306 "dma_device_id": "system", 00:12:04.306 "dma_device_type": 1 00:12:04.306 }, 00:12:04.306 { 00:12:04.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.306 "dma_device_type": 2 00:12:04.306 } 00:12:04.306 ], 00:12:04.306 "driver_specific": { 00:12:04.306 "passthru": { 00:12:04.306 "name": "pt2", 00:12:04.306 "base_bdev_name": "malloc2" 00:12:04.306 } 00:12:04.306 } 00:12:04.306 }' 00:12:04.306 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.306 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.564 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.564 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.564 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.564 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.564 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.564 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.564 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.564 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.564 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.822 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.822 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:04.822 09:16:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:05.388 [2024-07-15 09:16:14.039141] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d9352c5c-5817-4216-8c08-214c2fb1b6e4 '!=' d9352c5c-5817-4216-8c08-214c2fb1b6e4 ']' 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 93456 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 93456 ']' 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 93456 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 93456 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 93456' 00:12:05.388 killing process with pid 93456 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 93456 00:12:05.388 [2024-07-15 09:16:14.124264] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:05.388 [2024-07-15 09:16:14.124317] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:05.388 [2024-07-15 09:16:14.124360] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:05.388 [2024-07-15 09:16:14.124372] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x82eec0 name raid_bdev1, state offline 00:12:05.388 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 93456 00:12:05.388 [2024-07-15 09:16:14.143251] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:05.646 09:16:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:05.646 00:12:05.646 real 0m11.011s 00:12:05.646 user 0m19.660s 00:12:05.646 sys 0m2.003s 00:12:05.646 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:05.646 09:16:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.646 ************************************ 00:12:05.646 END TEST raid_superblock_test 00:12:05.646 ************************************ 00:12:05.646 09:16:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:05.647 09:16:14 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:05.647 09:16:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:05.647 09:16:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:05.647 09:16:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:05.647 ************************************ 00:12:05.647 START TEST raid_read_error_test 00:12:05.647 ************************************ 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 
00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.8MciA0Sl4s 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=95082 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 95082 /var/tmp/spdk-raid.sock 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 95082 ']' 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:05.647 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:05.647 09:16:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.647 [2024-07-15 09:16:14.507772] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:12:05.647 [2024-07-15 09:16:14.507822] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95082 ] 00:12:05.905 [2024-07-15 09:16:14.621912] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:05.905 [2024-07-15 09:16:14.726145] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:05.905 [2024-07-15 09:16:14.780492] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:05.905 [2024-07-15 09:16:14.780523] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:06.839 09:16:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:06.839 09:16:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:06.839 09:16:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:06.839 09:16:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:06.839 BaseBdev1_malloc 00:12:06.839 09:16:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:07.097 true 00:12:07.097 09:16:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:07.355 [2024-07-15 09:16:16.176644] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:07.355 [2024-07-15 09:16:16.176693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:07.355 [2024-07-15 09:16:16.176715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14fd0d0 00:12:07.355 [2024-07-15 09:16:16.176727] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:07.355 [2024-07-15 09:16:16.178659] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:07.355 [2024-07-15 09:16:16.178687] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:07.355 BaseBdev1 00:12:07.355 09:16:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:07.355 09:16:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:07.613 BaseBdev2_malloc 00:12:07.613 09:16:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:07.871 true 00:12:07.871 09:16:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:08.129 [2024-07-15 09:16:16.855106] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:08.129 [2024-07-15 09:16:16.855150] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:08.129 [2024-07-15 09:16:16.855172] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1501910 00:12:08.129 [2024-07-15 09:16:16.855184] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:08.129 [2024-07-15 09:16:16.856753] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:08.129 [2024-07-15 09:16:16.856781] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:08.129 BaseBdev2 00:12:08.129 09:16:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:08.387 [2024-07-15 09:16:17.099783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:08.387 [2024-07-15 09:16:17.101155] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:08.387 [2024-07-15 09:16:17.101349] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1503320 00:12:08.387 [2024-07-15 09:16:17.101362] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:08.387 [2024-07-15 09:16:17.101558] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1504290 00:12:08.387 [2024-07-15 09:16:17.101712] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1503320 00:12:08.387 [2024-07-15 09:16:17.101722] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1503320 00:12:08.387 [2024-07-15 09:16:17.101828] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.387 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:08.645 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:08.645 "name": "raid_bdev1", 00:12:08.645 "uuid": "2485a2a6-dc79-4b7e-9d03-e8643b982725", 00:12:08.645 "strip_size_kb": 64, 00:12:08.645 "state": "online", 00:12:08.645 "raid_level": 
"concat", 00:12:08.645 "superblock": true, 00:12:08.645 "num_base_bdevs": 2, 00:12:08.645 "num_base_bdevs_discovered": 2, 00:12:08.645 "num_base_bdevs_operational": 2, 00:12:08.645 "base_bdevs_list": [ 00:12:08.645 { 00:12:08.645 "name": "BaseBdev1", 00:12:08.645 "uuid": "ddb57566-2442-5e87-a82f-c09f36c526cc", 00:12:08.645 "is_configured": true, 00:12:08.645 "data_offset": 2048, 00:12:08.645 "data_size": 63488 00:12:08.645 }, 00:12:08.645 { 00:12:08.645 "name": "BaseBdev2", 00:12:08.645 "uuid": "af587f48-61cd-510f-9af3-da6c4c2057ea", 00:12:08.645 "is_configured": true, 00:12:08.645 "data_offset": 2048, 00:12:08.645 "data_size": 63488 00:12:08.645 } 00:12:08.645 ] 00:12:08.645 }' 00:12:08.645 09:16:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:08.645 09:16:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.211 09:16:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:09.211 09:16:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:09.211 [2024-07-15 09:16:18.122791] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14fe9b0 00:12:10.146 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.404 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:10.663 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:10.663 "name": "raid_bdev1", 00:12:10.663 "uuid": "2485a2a6-dc79-4b7e-9d03-e8643b982725", 00:12:10.663 "strip_size_kb": 64, 00:12:10.663 "state": "online", 
00:12:10.663 "raid_level": "concat", 00:12:10.663 "superblock": true, 00:12:10.663 "num_base_bdevs": 2, 00:12:10.663 "num_base_bdevs_discovered": 2, 00:12:10.663 "num_base_bdevs_operational": 2, 00:12:10.663 "base_bdevs_list": [ 00:12:10.663 { 00:12:10.663 "name": "BaseBdev1", 00:12:10.663 "uuid": "ddb57566-2442-5e87-a82f-c09f36c526cc", 00:12:10.663 "is_configured": true, 00:12:10.663 "data_offset": 2048, 00:12:10.663 "data_size": 63488 00:12:10.663 }, 00:12:10.663 { 00:12:10.663 "name": "BaseBdev2", 00:12:10.663 "uuid": "af587f48-61cd-510f-9af3-da6c4c2057ea", 00:12:10.663 "is_configured": true, 00:12:10.663 "data_offset": 2048, 00:12:10.663 "data_size": 63488 00:12:10.663 } 00:12:10.663 ] 00:12:10.663 }' 00:12:10.663 09:16:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:10.663 09:16:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.230 09:16:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:11.488 [2024-07-15 09:16:20.270629] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:11.488 [2024-07-15 09:16:20.270659] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:11.488 [2024-07-15 09:16:20.273820] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:11.488 [2024-07-15 09:16:20.273852] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:11.488 [2024-07-15 09:16:20.273879] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:11.488 [2024-07-15 09:16:20.273890] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1503320 name raid_bdev1, state offline 00:12:11.488 0 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 95082 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 95082 ']' 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 95082 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 95082 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 95082' 00:12:11.488 killing process with pid 95082 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 95082 00:12:11.488 [2024-07-15 09:16:20.337190] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:11.488 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 95082 00:12:11.488 [2024-07-15 09:16:20.347579] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.8MciA0Sl4s 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:12:11.746 00:12:11.746 real 0m6.125s 00:12:11.746 user 0m9.560s 00:12:11.746 sys 0m1.040s 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:11.746 09:16:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.746 ************************************ 00:12:11.746 END TEST raid_read_error_test 00:12:11.746 ************************************ 00:12:11.746 09:16:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:11.746 09:16:20 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:12:11.746 09:16:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:11.746 09:16:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:11.746 09:16:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:11.746 ************************************ 00:12:11.746 START TEST raid_write_error_test 00:12:11.746 ************************************ 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.F0gitOG913 00:12:11.746 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=96055 00:12:11.747 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 96055 /var/tmp/spdk-raid.sock 00:12:11.747 09:16:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:11.747 09:16:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 96055 ']' 00:12:11.747 09:16:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:11.747 09:16:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:11.747 09:16:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:11.747 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:11.747 09:16:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:11.747 09:16:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.004 [2024-07-15 09:16:20.735805] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
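For readers following the write-error run that starts here: the trace below builds, for each leg of the array, a malloc disk wrapped by an error bdev and exposed through a passthru bdev, creates a 64k-strip concat volume on top, and then injects write failures into the first leg while the bdevperf job is running. A condensed sketch of that setup, kept to the exact RPCs traced below (the $rpc shorthand and the loop are editorial conveniences, not part of bdev_raid.sh; it assumes the bdevperf process launched above is already serving /var/tmp/spdk-raid.sock):

    # Sketch only -- mirrors the RPC calls traced below in this log.
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for b in BaseBdev1 BaseBdev2; do
        $rpc bdev_malloc_create 32 512 -b "${b}_malloc"        # backing malloc disk
        $rpc bdev_error_create "${b}_malloc"                   # error bdev, shows up as EE_<name> below
        $rpc bdev_passthru_create -b "EE_${b}_malloc" -p "$b"  # claimed as BaseBdevN
    done

    # 64k-strip concat volume with superblock, as created in the trace
    $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s

    # writes hitting the first leg now fail (the test issues this after perform_tests has started)
    $rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure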
00:12:12.004 [2024-07-15 09:16:20.735874] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96055 ] 00:12:12.004 [2024-07-15 09:16:20.863540] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:12.264 [2024-07-15 09:16:20.966481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:12.264 [2024-07-15 09:16:21.025372] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:12.264 [2024-07-15 09:16:21.025438] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:12.830 09:16:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:12.830 09:16:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:12.830 09:16:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:12.830 09:16:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:13.088 BaseBdev1_malloc 00:12:13.088 09:16:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:13.347 true 00:12:13.347 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:13.347 [2024-07-15 09:16:22.226517] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:13.347 [2024-07-15 09:16:22.226561] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:13.347 [2024-07-15 09:16:22.226582] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbe00d0 00:12:13.347 [2024-07-15 09:16:22.226595] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:13.347 [2024-07-15 09:16:22.228444] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:13.347 [2024-07-15 09:16:22.228473] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:13.347 BaseBdev1 00:12:13.347 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:13.347 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:13.605 BaseBdev2_malloc 00:12:13.605 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:13.863 true 00:12:13.863 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:13.863 [2024-07-15 09:16:22.784583] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:13.863 [2024-07-15 09:16:22.784630] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:13.863 [2024-07-15 09:16:22.784652] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbe4910 00:12:13.863 [2024-07-15 09:16:22.784665] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:13.863 [2024-07-15 09:16:22.786270] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:13.863 [2024-07-15 09:16:22.786298] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:13.863 BaseBdev2 00:12:13.863 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:14.121 [2024-07-15 09:16:22.957071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:14.121 [2024-07-15 09:16:22.958288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:14.121 [2024-07-15 09:16:22.958483] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbe6320 00:12:14.121 [2024-07-15 09:16:22.958496] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:14.121 [2024-07-15 09:16:22.958675] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbe7290 00:12:14.121 [2024-07-15 09:16:22.958816] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbe6320 00:12:14.121 [2024-07-15 09:16:22.958826] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbe6320 00:12:14.121 [2024-07-15 09:16:22.958937] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.121 09:16:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:14.379 09:16:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.379 "name": "raid_bdev1", 00:12:14.379 "uuid": "88236c1d-6996-46a5-bb59-d0ab0d3c074d", 00:12:14.379 "strip_size_kb": 64, 00:12:14.379 "state": "online", 00:12:14.379 
"raid_level": "concat", 00:12:14.379 "superblock": true, 00:12:14.379 "num_base_bdevs": 2, 00:12:14.379 "num_base_bdevs_discovered": 2, 00:12:14.379 "num_base_bdevs_operational": 2, 00:12:14.379 "base_bdevs_list": [ 00:12:14.379 { 00:12:14.379 "name": "BaseBdev1", 00:12:14.379 "uuid": "486df479-d808-54ec-89e4-4f046a089d50", 00:12:14.379 "is_configured": true, 00:12:14.379 "data_offset": 2048, 00:12:14.379 "data_size": 63488 00:12:14.379 }, 00:12:14.379 { 00:12:14.379 "name": "BaseBdev2", 00:12:14.379 "uuid": "0143e153-9843-5ab6-a74d-3c3048f5ff2f", 00:12:14.380 "is_configured": true, 00:12:14.380 "data_offset": 2048, 00:12:14.380 "data_size": 63488 00:12:14.380 } 00:12:14.380 ] 00:12:14.380 }' 00:12:14.380 09:16:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.380 09:16:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:14.944 09:16:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:14.944 09:16:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:15.203 [2024-07-15 09:16:23.927975] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbe19b0 00:12:16.139 09:16:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.139 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:16.397 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:16.397 "name": "raid_bdev1", 00:12:16.397 "uuid": "88236c1d-6996-46a5-bb59-d0ab0d3c074d", 00:12:16.397 "strip_size_kb": 
64, 00:12:16.397 "state": "online", 00:12:16.397 "raid_level": "concat", 00:12:16.397 "superblock": true, 00:12:16.397 "num_base_bdevs": 2, 00:12:16.397 "num_base_bdevs_discovered": 2, 00:12:16.397 "num_base_bdevs_operational": 2, 00:12:16.397 "base_bdevs_list": [ 00:12:16.397 { 00:12:16.397 "name": "BaseBdev1", 00:12:16.397 "uuid": "486df479-d808-54ec-89e4-4f046a089d50", 00:12:16.397 "is_configured": true, 00:12:16.397 "data_offset": 2048, 00:12:16.397 "data_size": 63488 00:12:16.397 }, 00:12:16.397 { 00:12:16.397 "name": "BaseBdev2", 00:12:16.397 "uuid": "0143e153-9843-5ab6-a74d-3c3048f5ff2f", 00:12:16.397 "is_configured": true, 00:12:16.397 "data_offset": 2048, 00:12:16.397 "data_size": 63488 00:12:16.397 } 00:12:16.397 ] 00:12:16.397 }' 00:12:16.397 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:16.397 09:16:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.963 09:16:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:17.221 [2024-07-15 09:16:26.128992] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:17.221 [2024-07-15 09:16:26.129036] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:17.221 [2024-07-15 09:16:26.132190] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:17.221 [2024-07-15 09:16:26.132221] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:17.221 [2024-07-15 09:16:26.132249] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:17.221 [2024-07-15 09:16:26.132260] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe6320 name raid_bdev1, state offline 00:12:17.221 0 00:12:17.221 09:16:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 96055 00:12:17.221 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 96055 ']' 00:12:17.221 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 96055 00:12:17.221 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:17.221 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:17.221 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 96055 00:12:17.479 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:17.479 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:17.479 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 96055' 00:12:17.479 killing process with pid 96055 00:12:17.479 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 96055 00:12:17.479 [2024-07-15 09:16:26.195970] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:17.479 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 96055 00:12:17.479 [2024-07-15 09:16:26.206510] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:17.479 09:16:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.F0gitOG913 00:12:17.479 
09:16:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:17.479 09:16:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:17.737 09:16:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:12:17.737 09:16:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:17.737 09:16:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:17.737 09:16:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:17.737 09:16:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:12:17.737 00:12:17.737 real 0m5.777s 00:12:17.737 user 0m8.921s 00:12:17.737 sys 0m0.991s 00:12:17.737 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:17.737 09:16:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.737 ************************************ 00:12:17.737 END TEST raid_write_error_test 00:12:17.737 ************************************ 00:12:17.737 09:16:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:17.737 09:16:26 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:17.737 09:16:26 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:17.737 09:16:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:17.737 09:16:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:17.737 09:16:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:17.737 ************************************ 00:12:17.737 START TEST raid_state_function_test 00:12:17.737 ************************************ 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 
-- # local raid_bdev_name=Existed_Raid 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=96881 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 96881' 00:12:17.737 Process raid pid: 96881 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 96881 /var/tmp/spdk-raid.sock 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 96881 ']' 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:17.737 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:17.737 09:16:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.737 [2024-07-15 09:16:26.592483] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
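The state-function run launched here exercises configuration state rather than I/O: Existed_Raid is created while its base bdevs do not exist yet, the state is re-read after each malloc bdev is added, and the raid1 volume only reports "online" once both members are present. A condensed sketch of that walk, using only RPCs that appear in this trace (the $rpc shorthand and the trailing .state jq selection are editorial; the trace itself also deletes and re-creates the array between steps, which is omitted here):

    # Sketch only -- assumes the bdev_svc app started above is serving /var/tmp/spdk-raid.sock.
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Both base bdevs are still missing, so the array waits in "configuring".
    $rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # configuring

    # Each malloc bdev is claimed as soon as it appears; with both present the raid1 goes online.
    $rpc bdev_malloc_create 32 512 -b BaseBdev1
    $rpc bdev_malloc_create 32 512 -b BaseBdev2
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # online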
00:12:17.737 [2024-07-15 09:16:26.592547] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:17.994 [2024-07-15 09:16:26.723823] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:17.994 [2024-07-15 09:16:26.825623] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:17.994 [2024-07-15 09:16:26.886914] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:17.994 [2024-07-15 09:16:26.886957] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:18.650 09:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:18.650 09:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:18.650 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:18.908 [2024-07-15 09:16:27.743041] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:18.908 [2024-07-15 09:16:27.743087] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:18.908 [2024-07-15 09:16:27.743098] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:18.908 [2024-07-15 09:16:27.743110] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.908 09:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:19.165 09:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:19.165 "name": "Existed_Raid", 00:12:19.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:19.165 "strip_size_kb": 0, 00:12:19.165 "state": "configuring", 00:12:19.165 "raid_level": "raid1", 00:12:19.165 "superblock": false, 00:12:19.165 
"num_base_bdevs": 2, 00:12:19.165 "num_base_bdevs_discovered": 0, 00:12:19.165 "num_base_bdevs_operational": 2, 00:12:19.165 "base_bdevs_list": [ 00:12:19.165 { 00:12:19.165 "name": "BaseBdev1", 00:12:19.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:19.165 "is_configured": false, 00:12:19.165 "data_offset": 0, 00:12:19.165 "data_size": 0 00:12:19.165 }, 00:12:19.165 { 00:12:19.165 "name": "BaseBdev2", 00:12:19.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:19.165 "is_configured": false, 00:12:19.165 "data_offset": 0, 00:12:19.165 "data_size": 0 00:12:19.165 } 00:12:19.165 ] 00:12:19.165 }' 00:12:19.165 09:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.165 09:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.730 09:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:19.987 [2024-07-15 09:16:28.809838] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:19.987 [2024-07-15 09:16:28.809871] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x253fa80 name Existed_Raid, state configuring 00:12:19.987 09:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:20.245 [2024-07-15 09:16:29.054483] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:20.245 [2024-07-15 09:16:29.054514] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:20.245 [2024-07-15 09:16:29.054524] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:20.245 [2024-07-15 09:16:29.054535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:20.245 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:20.503 [2024-07-15 09:16:29.308880] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:20.503 BaseBdev1 00:12:20.503 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:20.503 09:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:20.503 09:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:20.503 09:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:20.503 09:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:20.503 09:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:20.503 09:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:20.761 09:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:21.017 [ 
00:12:21.017 { 00:12:21.017 "name": "BaseBdev1", 00:12:21.017 "aliases": [ 00:12:21.017 "365daa10-264d-41b3-a3e5-fdc4c322310f" 00:12:21.017 ], 00:12:21.017 "product_name": "Malloc disk", 00:12:21.017 "block_size": 512, 00:12:21.017 "num_blocks": 65536, 00:12:21.017 "uuid": "365daa10-264d-41b3-a3e5-fdc4c322310f", 00:12:21.017 "assigned_rate_limits": { 00:12:21.017 "rw_ios_per_sec": 0, 00:12:21.017 "rw_mbytes_per_sec": 0, 00:12:21.017 "r_mbytes_per_sec": 0, 00:12:21.017 "w_mbytes_per_sec": 0 00:12:21.017 }, 00:12:21.017 "claimed": true, 00:12:21.017 "claim_type": "exclusive_write", 00:12:21.017 "zoned": false, 00:12:21.017 "supported_io_types": { 00:12:21.017 "read": true, 00:12:21.017 "write": true, 00:12:21.017 "unmap": true, 00:12:21.017 "flush": true, 00:12:21.017 "reset": true, 00:12:21.017 "nvme_admin": false, 00:12:21.017 "nvme_io": false, 00:12:21.017 "nvme_io_md": false, 00:12:21.017 "write_zeroes": true, 00:12:21.017 "zcopy": true, 00:12:21.017 "get_zone_info": false, 00:12:21.017 "zone_management": false, 00:12:21.017 "zone_append": false, 00:12:21.017 "compare": false, 00:12:21.017 "compare_and_write": false, 00:12:21.017 "abort": true, 00:12:21.017 "seek_hole": false, 00:12:21.017 "seek_data": false, 00:12:21.017 "copy": true, 00:12:21.017 "nvme_iov_md": false 00:12:21.017 }, 00:12:21.017 "memory_domains": [ 00:12:21.017 { 00:12:21.017 "dma_device_id": "system", 00:12:21.017 "dma_device_type": 1 00:12:21.017 }, 00:12:21.017 { 00:12:21.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.017 "dma_device_type": 2 00:12:21.017 } 00:12:21.017 ], 00:12:21.017 "driver_specific": {} 00:12:21.017 } 00:12:21.017 ] 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.017 09:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:21.273 09:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:21.273 "name": "Existed_Raid", 00:12:21.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:21.273 "strip_size_kb": 0, 00:12:21.273 "state": "configuring", 00:12:21.273 "raid_level": "raid1", 
00:12:21.273 "superblock": false, 00:12:21.273 "num_base_bdevs": 2, 00:12:21.273 "num_base_bdevs_discovered": 1, 00:12:21.273 "num_base_bdevs_operational": 2, 00:12:21.273 "base_bdevs_list": [ 00:12:21.273 { 00:12:21.273 "name": "BaseBdev1", 00:12:21.273 "uuid": "365daa10-264d-41b3-a3e5-fdc4c322310f", 00:12:21.273 "is_configured": true, 00:12:21.273 "data_offset": 0, 00:12:21.273 "data_size": 65536 00:12:21.273 }, 00:12:21.273 { 00:12:21.273 "name": "BaseBdev2", 00:12:21.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:21.273 "is_configured": false, 00:12:21.273 "data_offset": 0, 00:12:21.273 "data_size": 0 00:12:21.273 } 00:12:21.273 ] 00:12:21.273 }' 00:12:21.273 09:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:21.273 09:16:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.838 09:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:22.096 [2024-07-15 09:16:30.812890] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:22.096 [2024-07-15 09:16:30.812934] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x253f350 name Existed_Raid, state configuring 00:12:22.096 09:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:22.353 [2024-07-15 09:16:31.057563] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:22.353 [2024-07-15 09:16:31.059141] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:22.353 [2024-07-15 09:16:31.059174] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:22.353 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:22.353 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:22.354 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.612 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.612 "name": "Existed_Raid", 00:12:22.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.612 "strip_size_kb": 0, 00:12:22.612 "state": "configuring", 00:12:22.612 "raid_level": "raid1", 00:12:22.612 "superblock": false, 00:12:22.612 "num_base_bdevs": 2, 00:12:22.612 "num_base_bdevs_discovered": 1, 00:12:22.612 "num_base_bdevs_operational": 2, 00:12:22.612 "base_bdevs_list": [ 00:12:22.612 { 00:12:22.612 "name": "BaseBdev1", 00:12:22.612 "uuid": "365daa10-264d-41b3-a3e5-fdc4c322310f", 00:12:22.612 "is_configured": true, 00:12:22.612 "data_offset": 0, 00:12:22.612 "data_size": 65536 00:12:22.612 }, 00:12:22.612 { 00:12:22.612 "name": "BaseBdev2", 00:12:22.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.612 "is_configured": false, 00:12:22.612 "data_offset": 0, 00:12:22.612 "data_size": 0 00:12:22.612 } 00:12:22.612 ] 00:12:22.612 }' 00:12:22.612 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.612 09:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.180 09:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:23.180 [2024-07-15 09:16:32.131959] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:23.180 [2024-07-15 09:16:32.131997] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2540000 00:12:23.180 [2024-07-15 09:16:32.132006] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:23.180 [2024-07-15 09:16:32.132197] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x245a0c0 00:12:23.180 [2024-07-15 09:16:32.132317] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2540000 00:12:23.180 [2024-07-15 09:16:32.132329] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2540000 00:12:23.180 [2024-07-15 09:16:32.132506] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:23.438 BaseBdev2 00:12:23.438 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:23.438 09:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:23.438 09:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:23.438 09:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:23.438 09:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:23.438 09:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:23.438 09:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:23.696 09:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:23.696 [ 
00:12:23.696 { 00:12:23.696 "name": "BaseBdev2", 00:12:23.696 "aliases": [ 00:12:23.696 "5d61ffc6-26de-43f9-b56d-ec99b1e197ed" 00:12:23.696 ], 00:12:23.696 "product_name": "Malloc disk", 00:12:23.696 "block_size": 512, 00:12:23.696 "num_blocks": 65536, 00:12:23.696 "uuid": "5d61ffc6-26de-43f9-b56d-ec99b1e197ed", 00:12:23.696 "assigned_rate_limits": { 00:12:23.696 "rw_ios_per_sec": 0, 00:12:23.696 "rw_mbytes_per_sec": 0, 00:12:23.696 "r_mbytes_per_sec": 0, 00:12:23.696 "w_mbytes_per_sec": 0 00:12:23.696 }, 00:12:23.696 "claimed": true, 00:12:23.696 "claim_type": "exclusive_write", 00:12:23.696 "zoned": false, 00:12:23.696 "supported_io_types": { 00:12:23.696 "read": true, 00:12:23.696 "write": true, 00:12:23.696 "unmap": true, 00:12:23.696 "flush": true, 00:12:23.696 "reset": true, 00:12:23.696 "nvme_admin": false, 00:12:23.696 "nvme_io": false, 00:12:23.696 "nvme_io_md": false, 00:12:23.696 "write_zeroes": true, 00:12:23.696 "zcopy": true, 00:12:23.696 "get_zone_info": false, 00:12:23.696 "zone_management": false, 00:12:23.696 "zone_append": false, 00:12:23.696 "compare": false, 00:12:23.696 "compare_and_write": false, 00:12:23.696 "abort": true, 00:12:23.696 "seek_hole": false, 00:12:23.696 "seek_data": false, 00:12:23.696 "copy": true, 00:12:23.696 "nvme_iov_md": false 00:12:23.696 }, 00:12:23.696 "memory_domains": [ 00:12:23.696 { 00:12:23.696 "dma_device_id": "system", 00:12:23.696 "dma_device_type": 1 00:12:23.696 }, 00:12:23.696 { 00:12:23.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.696 "dma_device_type": 2 00:12:23.696 } 00:12:23.696 ], 00:12:23.697 "driver_specific": {} 00:12:23.697 } 00:12:23.697 ] 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:23.697 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.955 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:23.955 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:12:23.955 "name": "Existed_Raid", 00:12:23.955 "uuid": "c5c1be7b-4fbf-4f31-985b-53737274d32e", 00:12:23.955 "strip_size_kb": 0, 00:12:23.955 "state": "online", 00:12:23.955 "raid_level": "raid1", 00:12:23.955 "superblock": false, 00:12:23.955 "num_base_bdevs": 2, 00:12:23.955 "num_base_bdevs_discovered": 2, 00:12:23.955 "num_base_bdevs_operational": 2, 00:12:23.955 "base_bdevs_list": [ 00:12:23.955 { 00:12:23.955 "name": "BaseBdev1", 00:12:23.955 "uuid": "365daa10-264d-41b3-a3e5-fdc4c322310f", 00:12:23.955 "is_configured": true, 00:12:23.955 "data_offset": 0, 00:12:23.955 "data_size": 65536 00:12:23.955 }, 00:12:23.955 { 00:12:23.955 "name": "BaseBdev2", 00:12:23.955 "uuid": "5d61ffc6-26de-43f9-b56d-ec99b1e197ed", 00:12:23.955 "is_configured": true, 00:12:23.955 "data_offset": 0, 00:12:23.955 "data_size": 65536 00:12:23.955 } 00:12:23.955 ] 00:12:23.955 }' 00:12:23.955 09:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:23.955 09:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.522 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:24.778 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:24.778 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:24.778 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:24.778 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:24.778 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:24.778 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:24.778 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:24.778 [2024-07-15 09:16:33.692378] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:24.778 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:24.778 "name": "Existed_Raid", 00:12:24.778 "aliases": [ 00:12:24.778 "c5c1be7b-4fbf-4f31-985b-53737274d32e" 00:12:24.778 ], 00:12:24.778 "product_name": "Raid Volume", 00:12:24.778 "block_size": 512, 00:12:24.778 "num_blocks": 65536, 00:12:24.778 "uuid": "c5c1be7b-4fbf-4f31-985b-53737274d32e", 00:12:24.778 "assigned_rate_limits": { 00:12:24.778 "rw_ios_per_sec": 0, 00:12:24.778 "rw_mbytes_per_sec": 0, 00:12:24.778 "r_mbytes_per_sec": 0, 00:12:24.778 "w_mbytes_per_sec": 0 00:12:24.778 }, 00:12:24.778 "claimed": false, 00:12:24.778 "zoned": false, 00:12:24.778 "supported_io_types": { 00:12:24.778 "read": true, 00:12:24.778 "write": true, 00:12:24.778 "unmap": false, 00:12:24.778 "flush": false, 00:12:24.778 "reset": true, 00:12:24.778 "nvme_admin": false, 00:12:24.778 "nvme_io": false, 00:12:24.778 "nvme_io_md": false, 00:12:24.778 "write_zeroes": true, 00:12:24.778 "zcopy": false, 00:12:24.778 "get_zone_info": false, 00:12:24.778 "zone_management": false, 00:12:24.778 "zone_append": false, 00:12:24.778 "compare": false, 00:12:24.778 "compare_and_write": false, 00:12:24.778 "abort": false, 00:12:24.778 "seek_hole": false, 00:12:24.778 "seek_data": false, 00:12:24.778 "copy": false, 00:12:24.778 "nvme_iov_md": false 00:12:24.778 }, 00:12:24.778 
"memory_domains": [ 00:12:24.778 { 00:12:24.778 "dma_device_id": "system", 00:12:24.778 "dma_device_type": 1 00:12:24.778 }, 00:12:24.778 { 00:12:24.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.778 "dma_device_type": 2 00:12:24.778 }, 00:12:24.778 { 00:12:24.778 "dma_device_id": "system", 00:12:24.778 "dma_device_type": 1 00:12:24.778 }, 00:12:24.778 { 00:12:24.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.778 "dma_device_type": 2 00:12:24.778 } 00:12:24.778 ], 00:12:24.778 "driver_specific": { 00:12:24.778 "raid": { 00:12:24.778 "uuid": "c5c1be7b-4fbf-4f31-985b-53737274d32e", 00:12:24.778 "strip_size_kb": 0, 00:12:24.778 "state": "online", 00:12:24.778 "raid_level": "raid1", 00:12:24.778 "superblock": false, 00:12:24.778 "num_base_bdevs": 2, 00:12:24.778 "num_base_bdevs_discovered": 2, 00:12:24.779 "num_base_bdevs_operational": 2, 00:12:24.779 "base_bdevs_list": [ 00:12:24.779 { 00:12:24.779 "name": "BaseBdev1", 00:12:24.779 "uuid": "365daa10-264d-41b3-a3e5-fdc4c322310f", 00:12:24.779 "is_configured": true, 00:12:24.779 "data_offset": 0, 00:12:24.779 "data_size": 65536 00:12:24.779 }, 00:12:24.779 { 00:12:24.779 "name": "BaseBdev2", 00:12:24.779 "uuid": "5d61ffc6-26de-43f9-b56d-ec99b1e197ed", 00:12:24.779 "is_configured": true, 00:12:24.779 "data_offset": 0, 00:12:24.779 "data_size": 65536 00:12:24.779 } 00:12:24.779 ] 00:12:24.779 } 00:12:24.779 } 00:12:24.779 }' 00:12:24.779 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:25.036 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:25.036 BaseBdev2' 00:12:25.036 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:25.036 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:25.036 09:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:25.294 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:25.294 "name": "BaseBdev1", 00:12:25.294 "aliases": [ 00:12:25.294 "365daa10-264d-41b3-a3e5-fdc4c322310f" 00:12:25.294 ], 00:12:25.294 "product_name": "Malloc disk", 00:12:25.294 "block_size": 512, 00:12:25.294 "num_blocks": 65536, 00:12:25.294 "uuid": "365daa10-264d-41b3-a3e5-fdc4c322310f", 00:12:25.294 "assigned_rate_limits": { 00:12:25.294 "rw_ios_per_sec": 0, 00:12:25.294 "rw_mbytes_per_sec": 0, 00:12:25.294 "r_mbytes_per_sec": 0, 00:12:25.294 "w_mbytes_per_sec": 0 00:12:25.294 }, 00:12:25.294 "claimed": true, 00:12:25.294 "claim_type": "exclusive_write", 00:12:25.294 "zoned": false, 00:12:25.294 "supported_io_types": { 00:12:25.294 "read": true, 00:12:25.294 "write": true, 00:12:25.294 "unmap": true, 00:12:25.294 "flush": true, 00:12:25.294 "reset": true, 00:12:25.294 "nvme_admin": false, 00:12:25.294 "nvme_io": false, 00:12:25.294 "nvme_io_md": false, 00:12:25.294 "write_zeroes": true, 00:12:25.294 "zcopy": true, 00:12:25.294 "get_zone_info": false, 00:12:25.294 "zone_management": false, 00:12:25.294 "zone_append": false, 00:12:25.294 "compare": false, 00:12:25.294 "compare_and_write": false, 00:12:25.294 "abort": true, 00:12:25.294 "seek_hole": false, 00:12:25.294 "seek_data": false, 00:12:25.294 "copy": true, 00:12:25.294 "nvme_iov_md": false 00:12:25.294 }, 00:12:25.294 
"memory_domains": [ 00:12:25.294 { 00:12:25.294 "dma_device_id": "system", 00:12:25.294 "dma_device_type": 1 00:12:25.294 }, 00:12:25.294 { 00:12:25.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.294 "dma_device_type": 2 00:12:25.294 } 00:12:25.294 ], 00:12:25.294 "driver_specific": {} 00:12:25.294 }' 00:12:25.294 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:25.294 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:25.294 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:25.294 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:25.294 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:25.294 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:25.294 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:25.294 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:25.552 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:25.552 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:25.552 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:25.552 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:25.552 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:25.552 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:25.552 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:25.809 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:25.809 "name": "BaseBdev2", 00:12:25.809 "aliases": [ 00:12:25.809 "5d61ffc6-26de-43f9-b56d-ec99b1e197ed" 00:12:25.809 ], 00:12:25.809 "product_name": "Malloc disk", 00:12:25.809 "block_size": 512, 00:12:25.809 "num_blocks": 65536, 00:12:25.809 "uuid": "5d61ffc6-26de-43f9-b56d-ec99b1e197ed", 00:12:25.809 "assigned_rate_limits": { 00:12:25.809 "rw_ios_per_sec": 0, 00:12:25.809 "rw_mbytes_per_sec": 0, 00:12:25.809 "r_mbytes_per_sec": 0, 00:12:25.809 "w_mbytes_per_sec": 0 00:12:25.809 }, 00:12:25.809 "claimed": true, 00:12:25.809 "claim_type": "exclusive_write", 00:12:25.809 "zoned": false, 00:12:25.809 "supported_io_types": { 00:12:25.809 "read": true, 00:12:25.809 "write": true, 00:12:25.809 "unmap": true, 00:12:25.809 "flush": true, 00:12:25.809 "reset": true, 00:12:25.809 "nvme_admin": false, 00:12:25.809 "nvme_io": false, 00:12:25.809 "nvme_io_md": false, 00:12:25.809 "write_zeroes": true, 00:12:25.809 "zcopy": true, 00:12:25.809 "get_zone_info": false, 00:12:25.809 "zone_management": false, 00:12:25.809 "zone_append": false, 00:12:25.809 "compare": false, 00:12:25.809 "compare_and_write": false, 00:12:25.809 "abort": true, 00:12:25.809 "seek_hole": false, 00:12:25.809 "seek_data": false, 00:12:25.809 "copy": true, 00:12:25.809 "nvme_iov_md": false 00:12:25.809 }, 00:12:25.809 "memory_domains": [ 00:12:25.809 { 00:12:25.809 "dma_device_id": "system", 00:12:25.809 "dma_device_type": 1 00:12:25.809 }, 00:12:25.809 { 00:12:25.809 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:25.809 "dma_device_type": 2 00:12:25.809 } 00:12:25.809 ], 00:12:25.809 "driver_specific": {} 00:12:25.809 }' 00:12:25.809 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:25.809 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:25.810 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:25.810 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:25.810 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:26.068 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:26.068 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:26.068 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:26.068 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:26.068 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:26.068 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:26.068 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:26.068 09:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:26.326 [2024-07-15 09:16:35.188127] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:26.326 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.584 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.584 "name": "Existed_Raid", 00:12:26.584 "uuid": "c5c1be7b-4fbf-4f31-985b-53737274d32e", 00:12:26.584 "strip_size_kb": 0, 00:12:26.584 "state": "online", 00:12:26.584 "raid_level": "raid1", 00:12:26.584 "superblock": false, 00:12:26.584 "num_base_bdevs": 2, 00:12:26.584 "num_base_bdevs_discovered": 1, 00:12:26.584 "num_base_bdevs_operational": 1, 00:12:26.584 "base_bdevs_list": [ 00:12:26.584 { 00:12:26.584 "name": null, 00:12:26.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.584 "is_configured": false, 00:12:26.584 "data_offset": 0, 00:12:26.584 "data_size": 65536 00:12:26.584 }, 00:12:26.584 { 00:12:26.584 "name": "BaseBdev2", 00:12:26.584 "uuid": "5d61ffc6-26de-43f9-b56d-ec99b1e197ed", 00:12:26.584 "is_configured": true, 00:12:26.584 "data_offset": 0, 00:12:26.584 "data_size": 65536 00:12:26.584 } 00:12:26.584 ] 00:12:26.584 }' 00:12:26.584 09:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.584 09:16:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.149 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:27.149 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:27.149 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.149 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:27.407 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:27.407 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:27.407 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:27.664 [2024-07-15 09:16:36.512696] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:27.664 [2024-07-15 09:16:36.512781] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:27.664 [2024-07-15 09:16:36.525065] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:27.665 [2024-07-15 09:16:36.525102] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:27.665 [2024-07-15 09:16:36.525114] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2540000 name Existed_Raid, state offline 00:12:27.665 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:27.665 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:27.665 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.665 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
raid_bdev= 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 96881 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 96881 ']' 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 96881 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 96881 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 96881' 00:12:27.923 killing process with pid 96881 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 96881 00:12:27.923 [2024-07-15 09:16:36.841034] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:27.923 09:16:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 96881 00:12:27.923 [2024-07-15 09:16:36.842018] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:28.182 09:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:28.182 00:12:28.182 real 0m10.544s 00:12:28.182 user 0m18.772s 00:12:28.182 sys 0m1.922s 00:12:28.182 09:16:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:28.182 09:16:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.182 ************************************ 00:12:28.182 END TEST raid_state_function_test 00:12:28.182 ************************************ 00:12:28.182 09:16:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:28.182 09:16:37 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:12:28.182 09:16:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:28.182 09:16:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:28.182 09:16:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:28.441 ************************************ 00:12:28.441 START TEST raid_state_function_test_sb 00:12:28.441 ************************************ 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=98530 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 98530' 00:12:28.441 Process raid pid: 98530 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 98530 /var/tmp/spdk-raid.sock 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 98530 ']' 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:28.441 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:28.441 09:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:28.441 [2024-07-15 09:16:37.226315] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:12:28.441 [2024-07-15 09:16:37.226386] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:28.441 [2024-07-15 09:16:37.359644] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:28.700 [2024-07-15 09:16:37.462685] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:28.700 [2024-07-15 09:16:37.527098] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:28.700 [2024-07-15 09:16:37.527136] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:29.267 09:16:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:29.267 09:16:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:29.267 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:29.525 [2024-07-15 09:16:38.387706] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:29.525 [2024-07-15 09:16:38.387750] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:29.525 [2024-07-15 09:16:38.387761] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:29.525 [2024-07-15 09:16:38.387774] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.525 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:12:29.783 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.783 "name": "Existed_Raid", 00:12:29.783 "uuid": "49b7ca04-9869-4d94-b221-c540025d8335", 00:12:29.783 "strip_size_kb": 0, 00:12:29.783 "state": "configuring", 00:12:29.783 "raid_level": "raid1", 00:12:29.783 "superblock": true, 00:12:29.783 "num_base_bdevs": 2, 00:12:29.783 "num_base_bdevs_discovered": 0, 00:12:29.783 "num_base_bdevs_operational": 2, 00:12:29.783 "base_bdevs_list": [ 00:12:29.783 { 00:12:29.783 "name": "BaseBdev1", 00:12:29.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.783 "is_configured": false, 00:12:29.783 "data_offset": 0, 00:12:29.783 "data_size": 0 00:12:29.783 }, 00:12:29.783 { 00:12:29.783 "name": "BaseBdev2", 00:12:29.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.783 "is_configured": false, 00:12:29.783 "data_offset": 0, 00:12:29.783 "data_size": 0 00:12:29.783 } 00:12:29.783 ] 00:12:29.783 }' 00:12:29.783 09:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.783 09:16:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:30.350 09:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:30.607 [2024-07-15 09:16:39.466458] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:30.607 [2024-07-15 09:16:39.466495] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e84a80 name Existed_Raid, state configuring 00:12:30.607 09:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:30.865 [2024-07-15 09:16:39.711129] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:30.865 [2024-07-15 09:16:39.711170] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:30.865 [2024-07-15 09:16:39.711180] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:30.865 [2024-07-15 09:16:39.711192] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:30.865 09:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:31.124 [2024-07-15 09:16:39.966918] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:31.124 BaseBdev1 00:12:31.124 09:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:31.124 09:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:31.124 09:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:31.124 09:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:31.124 09:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:31.124 09:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:31.124 09:16:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:31.383 09:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:31.641 [ 00:12:31.641 { 00:12:31.641 "name": "BaseBdev1", 00:12:31.641 "aliases": [ 00:12:31.641 "da61939f-245e-4c17-a676-81a65728f7de" 00:12:31.641 ], 00:12:31.641 "product_name": "Malloc disk", 00:12:31.641 "block_size": 512, 00:12:31.641 "num_blocks": 65536, 00:12:31.641 "uuid": "da61939f-245e-4c17-a676-81a65728f7de", 00:12:31.641 "assigned_rate_limits": { 00:12:31.641 "rw_ios_per_sec": 0, 00:12:31.641 "rw_mbytes_per_sec": 0, 00:12:31.641 "r_mbytes_per_sec": 0, 00:12:31.641 "w_mbytes_per_sec": 0 00:12:31.641 }, 00:12:31.641 "claimed": true, 00:12:31.641 "claim_type": "exclusive_write", 00:12:31.641 "zoned": false, 00:12:31.641 "supported_io_types": { 00:12:31.641 "read": true, 00:12:31.641 "write": true, 00:12:31.641 "unmap": true, 00:12:31.641 "flush": true, 00:12:31.641 "reset": true, 00:12:31.641 "nvme_admin": false, 00:12:31.641 "nvme_io": false, 00:12:31.641 "nvme_io_md": false, 00:12:31.641 "write_zeroes": true, 00:12:31.641 "zcopy": true, 00:12:31.641 "get_zone_info": false, 00:12:31.641 "zone_management": false, 00:12:31.641 "zone_append": false, 00:12:31.641 "compare": false, 00:12:31.641 "compare_and_write": false, 00:12:31.641 "abort": true, 00:12:31.641 "seek_hole": false, 00:12:31.641 "seek_data": false, 00:12:31.641 "copy": true, 00:12:31.641 "nvme_iov_md": false 00:12:31.641 }, 00:12:31.642 "memory_domains": [ 00:12:31.642 { 00:12:31.642 "dma_device_id": "system", 00:12:31.642 "dma_device_type": 1 00:12:31.642 }, 00:12:31.642 { 00:12:31.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.642 "dma_device_type": 2 00:12:31.642 } 00:12:31.642 ], 00:12:31.642 "driver_specific": {} 00:12:31.642 } 00:12:31.642 ] 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:12:31.642 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:31.900 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.900 "name": "Existed_Raid", 00:12:31.900 "uuid": "789e6614-aa9e-4aed-8119-677c0ed9ed61", 00:12:31.900 "strip_size_kb": 0, 00:12:31.900 "state": "configuring", 00:12:31.900 "raid_level": "raid1", 00:12:31.900 "superblock": true, 00:12:31.900 "num_base_bdevs": 2, 00:12:31.900 "num_base_bdevs_discovered": 1, 00:12:31.900 "num_base_bdevs_operational": 2, 00:12:31.900 "base_bdevs_list": [ 00:12:31.900 { 00:12:31.900 "name": "BaseBdev1", 00:12:31.900 "uuid": "da61939f-245e-4c17-a676-81a65728f7de", 00:12:31.900 "is_configured": true, 00:12:31.900 "data_offset": 2048, 00:12:31.900 "data_size": 63488 00:12:31.900 }, 00:12:31.900 { 00:12:31.900 "name": "BaseBdev2", 00:12:31.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:31.900 "is_configured": false, 00:12:31.900 "data_offset": 0, 00:12:31.900 "data_size": 0 00:12:31.900 } 00:12:31.900 ] 00:12:31.900 }' 00:12:31.900 09:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.900 09:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:32.467 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:32.724 [2024-07-15 09:16:41.539097] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:32.724 [2024-07-15 09:16:41.539145] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e84350 name Existed_Raid, state configuring 00:12:32.724 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:32.982 [2024-07-15 09:16:41.787789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:32.982 [2024-07-15 09:16:41.789292] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:32.982 [2024-07-15 09:16:41.789326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.982 
09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.982 09:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.240 09:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.240 "name": "Existed_Raid", 00:12:33.240 "uuid": "c824ac74-740a-43bf-b172-5249669d9c68", 00:12:33.240 "strip_size_kb": 0, 00:12:33.240 "state": "configuring", 00:12:33.240 "raid_level": "raid1", 00:12:33.240 "superblock": true, 00:12:33.240 "num_base_bdevs": 2, 00:12:33.240 "num_base_bdevs_discovered": 1, 00:12:33.240 "num_base_bdevs_operational": 2, 00:12:33.240 "base_bdevs_list": [ 00:12:33.240 { 00:12:33.240 "name": "BaseBdev1", 00:12:33.240 "uuid": "da61939f-245e-4c17-a676-81a65728f7de", 00:12:33.240 "is_configured": true, 00:12:33.240 "data_offset": 2048, 00:12:33.240 "data_size": 63488 00:12:33.240 }, 00:12:33.240 { 00:12:33.240 "name": "BaseBdev2", 00:12:33.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.240 "is_configured": false, 00:12:33.240 "data_offset": 0, 00:12:33.240 "data_size": 0 00:12:33.240 } 00:12:33.240 ] 00:12:33.240 }' 00:12:33.240 09:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.240 09:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:33.805 09:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:34.062 [2024-07-15 09:16:42.894064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:34.062 [2024-07-15 09:16:42.894222] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e85000 00:12:34.062 [2024-07-15 09:16:42.894237] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:34.062 [2024-07-15 09:16:42.894413] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d9f0c0 00:12:34.062 [2024-07-15 09:16:42.894534] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e85000 00:12:34.062 [2024-07-15 09:16:42.894545] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e85000 00:12:34.062 [2024-07-15 09:16:42.894637] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:34.062 BaseBdev2 00:12:34.062 09:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:34.062 09:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:34.063 09:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:34.063 09:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:34.063 09:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
00:12:34.063 09:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:34.063 09:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:34.320 09:16:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:34.577 [ 00:12:34.577 { 00:12:34.577 "name": "BaseBdev2", 00:12:34.577 "aliases": [ 00:12:34.577 "c640ad62-67c6-40fc-aa71-5548faceb584" 00:12:34.577 ], 00:12:34.577 "product_name": "Malloc disk", 00:12:34.577 "block_size": 512, 00:12:34.577 "num_blocks": 65536, 00:12:34.577 "uuid": "c640ad62-67c6-40fc-aa71-5548faceb584", 00:12:34.577 "assigned_rate_limits": { 00:12:34.577 "rw_ios_per_sec": 0, 00:12:34.577 "rw_mbytes_per_sec": 0, 00:12:34.577 "r_mbytes_per_sec": 0, 00:12:34.577 "w_mbytes_per_sec": 0 00:12:34.577 }, 00:12:34.577 "claimed": true, 00:12:34.577 "claim_type": "exclusive_write", 00:12:34.577 "zoned": false, 00:12:34.577 "supported_io_types": { 00:12:34.577 "read": true, 00:12:34.577 "write": true, 00:12:34.577 "unmap": true, 00:12:34.577 "flush": true, 00:12:34.577 "reset": true, 00:12:34.577 "nvme_admin": false, 00:12:34.577 "nvme_io": false, 00:12:34.577 "nvme_io_md": false, 00:12:34.577 "write_zeroes": true, 00:12:34.577 "zcopy": true, 00:12:34.577 "get_zone_info": false, 00:12:34.577 "zone_management": false, 00:12:34.577 "zone_append": false, 00:12:34.577 "compare": false, 00:12:34.577 "compare_and_write": false, 00:12:34.577 "abort": true, 00:12:34.577 "seek_hole": false, 00:12:34.577 "seek_data": false, 00:12:34.577 "copy": true, 00:12:34.577 "nvme_iov_md": false 00:12:34.577 }, 00:12:34.577 "memory_domains": [ 00:12:34.577 { 00:12:34.577 "dma_device_id": "system", 00:12:34.577 "dma_device_type": 1 00:12:34.577 }, 00:12:34.577 { 00:12:34.577 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.577 "dma_device_type": 2 00:12:34.577 } 00:12:34.577 ], 00:12:34.577 "driver_specific": {} 00:12:34.577 } 00:12:34.577 ] 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.577 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.835 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.835 "name": "Existed_Raid", 00:12:34.835 "uuid": "c824ac74-740a-43bf-b172-5249669d9c68", 00:12:34.835 "strip_size_kb": 0, 00:12:34.835 "state": "online", 00:12:34.835 "raid_level": "raid1", 00:12:34.835 "superblock": true, 00:12:34.835 "num_base_bdevs": 2, 00:12:34.835 "num_base_bdevs_discovered": 2, 00:12:34.835 "num_base_bdevs_operational": 2, 00:12:34.835 "base_bdevs_list": [ 00:12:34.835 { 00:12:34.835 "name": "BaseBdev1", 00:12:34.835 "uuid": "da61939f-245e-4c17-a676-81a65728f7de", 00:12:34.835 "is_configured": true, 00:12:34.835 "data_offset": 2048, 00:12:34.835 "data_size": 63488 00:12:34.835 }, 00:12:34.835 { 00:12:34.835 "name": "BaseBdev2", 00:12:34.835 "uuid": "c640ad62-67c6-40fc-aa71-5548faceb584", 00:12:34.835 "is_configured": true, 00:12:34.835 "data_offset": 2048, 00:12:34.835 "data_size": 63488 00:12:34.835 } 00:12:34.835 ] 00:12:34.835 }' 00:12:34.835 09:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.835 09:16:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:35.441 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:35.441 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:35.441 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:35.441 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:35.441 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:35.441 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:35.441 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:35.441 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:35.699 [2024-07-15 09:16:44.406362] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:35.699 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:35.699 "name": "Existed_Raid", 00:12:35.699 "aliases": [ 00:12:35.699 "c824ac74-740a-43bf-b172-5249669d9c68" 00:12:35.699 ], 00:12:35.699 "product_name": "Raid Volume", 00:12:35.699 "block_size": 512, 00:12:35.699 "num_blocks": 63488, 00:12:35.699 "uuid": "c824ac74-740a-43bf-b172-5249669d9c68", 00:12:35.699 "assigned_rate_limits": { 00:12:35.699 "rw_ios_per_sec": 0, 00:12:35.699 "rw_mbytes_per_sec": 0, 00:12:35.699 "r_mbytes_per_sec": 0, 00:12:35.699 "w_mbytes_per_sec": 0 00:12:35.699 }, 00:12:35.699 "claimed": false, 00:12:35.699 "zoned": false, 00:12:35.699 "supported_io_types": { 00:12:35.699 "read": true, 
00:12:35.699 "write": true, 00:12:35.699 "unmap": false, 00:12:35.699 "flush": false, 00:12:35.699 "reset": true, 00:12:35.699 "nvme_admin": false, 00:12:35.699 "nvme_io": false, 00:12:35.699 "nvme_io_md": false, 00:12:35.699 "write_zeroes": true, 00:12:35.699 "zcopy": false, 00:12:35.699 "get_zone_info": false, 00:12:35.699 "zone_management": false, 00:12:35.699 "zone_append": false, 00:12:35.699 "compare": false, 00:12:35.699 "compare_and_write": false, 00:12:35.699 "abort": false, 00:12:35.699 "seek_hole": false, 00:12:35.699 "seek_data": false, 00:12:35.699 "copy": false, 00:12:35.699 "nvme_iov_md": false 00:12:35.699 }, 00:12:35.699 "memory_domains": [ 00:12:35.699 { 00:12:35.699 "dma_device_id": "system", 00:12:35.699 "dma_device_type": 1 00:12:35.699 }, 00:12:35.699 { 00:12:35.699 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.699 "dma_device_type": 2 00:12:35.699 }, 00:12:35.699 { 00:12:35.699 "dma_device_id": "system", 00:12:35.699 "dma_device_type": 1 00:12:35.699 }, 00:12:35.699 { 00:12:35.699 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.699 "dma_device_type": 2 00:12:35.699 } 00:12:35.699 ], 00:12:35.699 "driver_specific": { 00:12:35.699 "raid": { 00:12:35.699 "uuid": "c824ac74-740a-43bf-b172-5249669d9c68", 00:12:35.699 "strip_size_kb": 0, 00:12:35.699 "state": "online", 00:12:35.699 "raid_level": "raid1", 00:12:35.699 "superblock": true, 00:12:35.699 "num_base_bdevs": 2, 00:12:35.699 "num_base_bdevs_discovered": 2, 00:12:35.699 "num_base_bdevs_operational": 2, 00:12:35.699 "base_bdevs_list": [ 00:12:35.699 { 00:12:35.699 "name": "BaseBdev1", 00:12:35.699 "uuid": "da61939f-245e-4c17-a676-81a65728f7de", 00:12:35.699 "is_configured": true, 00:12:35.699 "data_offset": 2048, 00:12:35.699 "data_size": 63488 00:12:35.699 }, 00:12:35.699 { 00:12:35.699 "name": "BaseBdev2", 00:12:35.699 "uuid": "c640ad62-67c6-40fc-aa71-5548faceb584", 00:12:35.699 "is_configured": true, 00:12:35.699 "data_offset": 2048, 00:12:35.699 "data_size": 63488 00:12:35.699 } 00:12:35.699 ] 00:12:35.699 } 00:12:35.699 } 00:12:35.699 }' 00:12:35.699 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:35.699 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:35.699 BaseBdev2' 00:12:35.699 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:35.699 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:35.699 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:35.958 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:35.958 "name": "BaseBdev1", 00:12:35.958 "aliases": [ 00:12:35.958 "da61939f-245e-4c17-a676-81a65728f7de" 00:12:35.958 ], 00:12:35.958 "product_name": "Malloc disk", 00:12:35.958 "block_size": 512, 00:12:35.958 "num_blocks": 65536, 00:12:35.958 "uuid": "da61939f-245e-4c17-a676-81a65728f7de", 00:12:35.958 "assigned_rate_limits": { 00:12:35.958 "rw_ios_per_sec": 0, 00:12:35.958 "rw_mbytes_per_sec": 0, 00:12:35.958 "r_mbytes_per_sec": 0, 00:12:35.958 "w_mbytes_per_sec": 0 00:12:35.958 }, 00:12:35.958 "claimed": true, 00:12:35.958 "claim_type": "exclusive_write", 00:12:35.958 "zoned": false, 00:12:35.958 "supported_io_types": { 
00:12:35.958 "read": true, 00:12:35.958 "write": true, 00:12:35.958 "unmap": true, 00:12:35.958 "flush": true, 00:12:35.958 "reset": true, 00:12:35.958 "nvme_admin": false, 00:12:35.958 "nvme_io": false, 00:12:35.958 "nvme_io_md": false, 00:12:35.958 "write_zeroes": true, 00:12:35.958 "zcopy": true, 00:12:35.958 "get_zone_info": false, 00:12:35.958 "zone_management": false, 00:12:35.958 "zone_append": false, 00:12:35.958 "compare": false, 00:12:35.958 "compare_and_write": false, 00:12:35.958 "abort": true, 00:12:35.958 "seek_hole": false, 00:12:35.958 "seek_data": false, 00:12:35.958 "copy": true, 00:12:35.958 "nvme_iov_md": false 00:12:35.958 }, 00:12:35.958 "memory_domains": [ 00:12:35.958 { 00:12:35.958 "dma_device_id": "system", 00:12:35.958 "dma_device_type": 1 00:12:35.958 }, 00:12:35.958 { 00:12:35.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.958 "dma_device_type": 2 00:12:35.958 } 00:12:35.958 ], 00:12:35.958 "driver_specific": {} 00:12:35.958 }' 00:12:35.958 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.958 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.958 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:35.958 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.958 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.958 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:35.958 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.216 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.216 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:36.216 09:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.216 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.216 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:36.216 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:36.216 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:36.216 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:36.474 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:36.474 "name": "BaseBdev2", 00:12:36.474 "aliases": [ 00:12:36.474 "c640ad62-67c6-40fc-aa71-5548faceb584" 00:12:36.474 ], 00:12:36.474 "product_name": "Malloc disk", 00:12:36.474 "block_size": 512, 00:12:36.474 "num_blocks": 65536, 00:12:36.474 "uuid": "c640ad62-67c6-40fc-aa71-5548faceb584", 00:12:36.474 "assigned_rate_limits": { 00:12:36.474 "rw_ios_per_sec": 0, 00:12:36.474 "rw_mbytes_per_sec": 0, 00:12:36.474 "r_mbytes_per_sec": 0, 00:12:36.474 "w_mbytes_per_sec": 0 00:12:36.474 }, 00:12:36.474 "claimed": true, 00:12:36.474 "claim_type": "exclusive_write", 00:12:36.474 "zoned": false, 00:12:36.474 "supported_io_types": { 00:12:36.474 "read": true, 00:12:36.474 "write": true, 00:12:36.474 "unmap": true, 00:12:36.474 "flush": true, 00:12:36.474 "reset": 
true, 00:12:36.474 "nvme_admin": false, 00:12:36.474 "nvme_io": false, 00:12:36.474 "nvme_io_md": false, 00:12:36.474 "write_zeroes": true, 00:12:36.474 "zcopy": true, 00:12:36.474 "get_zone_info": false, 00:12:36.474 "zone_management": false, 00:12:36.474 "zone_append": false, 00:12:36.474 "compare": false, 00:12:36.474 "compare_and_write": false, 00:12:36.474 "abort": true, 00:12:36.474 "seek_hole": false, 00:12:36.474 "seek_data": false, 00:12:36.474 "copy": true, 00:12:36.474 "nvme_iov_md": false 00:12:36.474 }, 00:12:36.474 "memory_domains": [ 00:12:36.474 { 00:12:36.474 "dma_device_id": "system", 00:12:36.474 "dma_device_type": 1 00:12:36.474 }, 00:12:36.474 { 00:12:36.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.474 "dma_device_type": 2 00:12:36.474 } 00:12:36.474 ], 00:12:36.474 "driver_specific": {} 00:12:36.474 }' 00:12:36.474 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:36.474 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:36.474 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:36.474 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:36.732 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:36.732 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:36.732 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.732 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.732 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:36.732 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.732 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.732 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:36.732 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:36.990 [2024-07-15 09:16:45.853997] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.990 09:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:37.247 09:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:37.247 "name": "Existed_Raid", 00:12:37.247 "uuid": "c824ac74-740a-43bf-b172-5249669d9c68", 00:12:37.247 "strip_size_kb": 0, 00:12:37.247 "state": "online", 00:12:37.247 "raid_level": "raid1", 00:12:37.247 "superblock": true, 00:12:37.247 "num_base_bdevs": 2, 00:12:37.247 "num_base_bdevs_discovered": 1, 00:12:37.247 "num_base_bdevs_operational": 1, 00:12:37.247 "base_bdevs_list": [ 00:12:37.247 { 00:12:37.247 "name": null, 00:12:37.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.247 "is_configured": false, 00:12:37.247 "data_offset": 2048, 00:12:37.247 "data_size": 63488 00:12:37.247 }, 00:12:37.247 { 00:12:37.247 "name": "BaseBdev2", 00:12:37.247 "uuid": "c640ad62-67c6-40fc-aa71-5548faceb584", 00:12:37.247 "is_configured": true, 00:12:37.247 "data_offset": 2048, 00:12:37.247 "data_size": 63488 00:12:37.247 } 00:12:37.247 ] 00:12:37.247 }' 00:12:37.247 09:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:37.247 09:16:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:37.809 09:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:37.809 09:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:37.809 09:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.809 09:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:38.065 09:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:38.065 09:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:38.065 09:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:38.322 [2024-07-15 09:16:47.151413] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:38.322 [2024-07-15 09:16:47.151511] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:38.322 [2024-07-15 09:16:47.164276] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:38.322 [2024-07-15 09:16:47.164318] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:38.322 [2024-07-15 09:16:47.164338] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e85000 name Existed_Raid, state offline 00:12:38.322 09:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:38.322 09:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:38.322 09:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.322 09:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:38.580 09:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:38.580 09:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:38.580 09:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:38.580 09:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 98530 00:12:38.580 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 98530 ']' 00:12:38.581 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 98530 00:12:38.581 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:38.581 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:38.581 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 98530 00:12:38.581 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:38.581 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:38.581 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 98530' 00:12:38.581 killing process with pid 98530 00:12:38.581 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 98530 00:12:38.581 [2024-07-15 09:16:47.467179] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:38.581 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 98530 00:12:38.581 [2024-07-15 09:16:47.468189] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:38.839 09:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:38.839 00:12:38.839 real 0m10.541s 00:12:38.839 user 0m18.779s 00:12:38.839 sys 0m1.959s 00:12:38.839 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:38.839 09:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:38.839 ************************************ 00:12:38.839 END TEST raid_state_function_test_sb 00:12:38.839 ************************************ 00:12:38.839 09:16:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:38.839 09:16:47 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:12:38.839 09:16:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:38.839 09:16:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:12:38.839 09:16:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:38.839 ************************************ 00:12:38.839 START TEST raid_superblock_test 00:12:38.839 ************************************ 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=100165 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 100165 /var/tmp/spdk-raid.sock 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 100165 ']' 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:38.839 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:38.839 09:16:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.097 [2024-07-15 09:16:47.839128] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
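The startup messages that follow come from a standalone bdev_svc app which the superblock test launches on a private RPC socket and then waits for with SPDK's waitforlisten helper. A rough equivalent of that launch is sketched here; the binary, socket path and -L bdev_raid flag are taken from the trace above, while the polling loop and the rpc_get_methods probe are only an illustrative stand-in for waitforlisten.

app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# start the bdev service with raid debug logging, as bdev_raid.sh@410 does above
"$app" -r "$sock" -L bdev_raid &
raid_pid=$!
# illustrative stand-in for waitforlisten: poll until the RPC socket answers
until "$rpc" -s "$sock" rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done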
00:12:39.097 [2024-07-15 09:16:47.839195] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid100165 ] 00:12:39.097 [2024-07-15 09:16:47.968360] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:39.355 [2024-07-15 09:16:48.072182] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:39.355 [2024-07-15 09:16:48.131156] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:39.355 [2024-07-15 09:16:48.131185] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:39.921 09:16:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:40.179 malloc1 00:12:40.179 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:40.437 [2024-07-15 09:16:49.251991] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:40.437 [2024-07-15 09:16:49.252039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:40.437 [2024-07-15 09:16:49.252063] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1329570 00:12:40.437 [2024-07-15 09:16:49.252076] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:40.437 [2024-07-15 09:16:49.253682] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:40.437 [2024-07-15 09:16:49.253710] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:40.437 pt1 00:12:40.437 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:40.437 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:40.437 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:40.437 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:40.437 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:40.437 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:40.437 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:40.437 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:40.437 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:40.695 malloc2 00:12:40.695 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:40.952 [2024-07-15 09:16:49.738047] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:40.952 [2024-07-15 09:16:49.738095] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:40.952 [2024-07-15 09:16:49.738113] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132a970 00:12:40.952 [2024-07-15 09:16:49.738125] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:40.952 [2024-07-15 09:16:49.739571] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:40.952 [2024-07-15 09:16:49.739598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:40.952 pt2 00:12:40.952 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:40.952 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:40.952 09:16:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:41.211 [2024-07-15 09:16:49.982711] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:41.211 [2024-07-15 09:16:49.983882] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:41.211 [2024-07-15 09:16:49.984031] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14cd270 00:12:41.211 [2024-07-15 09:16:49.984045] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:41.211 [2024-07-15 09:16:49.984225] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13210e0 00:12:41.211 [2024-07-15 09:16:49.984365] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14cd270 00:12:41.211 [2024-07-15 09:16:49.984375] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14cd270 00:12:41.211 [2024-07-15 09:16:49.984467] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:41.211 09:16:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:41.211 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.469 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.469 "name": "raid_bdev1", 00:12:41.469 "uuid": "e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:41.469 "strip_size_kb": 0, 00:12:41.469 "state": "online", 00:12:41.469 "raid_level": "raid1", 00:12:41.469 "superblock": true, 00:12:41.469 "num_base_bdevs": 2, 00:12:41.469 "num_base_bdevs_discovered": 2, 00:12:41.469 "num_base_bdevs_operational": 2, 00:12:41.469 "base_bdevs_list": [ 00:12:41.469 { 00:12:41.469 "name": "pt1", 00:12:41.469 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:41.469 "is_configured": true, 00:12:41.469 "data_offset": 2048, 00:12:41.469 "data_size": 63488 00:12:41.469 }, 00:12:41.469 { 00:12:41.469 "name": "pt2", 00:12:41.469 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:41.469 "is_configured": true, 00:12:41.469 "data_offset": 2048, 00:12:41.469 "data_size": 63488 00:12:41.469 } 00:12:41.469 ] 00:12:41.469 }' 00:12:41.469 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.469 09:16:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.034 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:42.034 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:42.034 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:42.034 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:42.034 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:42.034 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:42.034 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:42.034 09:16:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:42.291 [2024-07-15 09:16:51.034087] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:42.291 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:42.291 "name": "raid_bdev1", 00:12:42.291 "aliases": [ 00:12:42.291 "e592ab81-3934-4660-a928-e3bf88e5aca7" 00:12:42.291 ], 00:12:42.291 "product_name": "Raid Volume", 00:12:42.291 "block_size": 512, 00:12:42.291 "num_blocks": 63488, 00:12:42.291 "uuid": 
"e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:42.291 "assigned_rate_limits": { 00:12:42.291 "rw_ios_per_sec": 0, 00:12:42.291 "rw_mbytes_per_sec": 0, 00:12:42.291 "r_mbytes_per_sec": 0, 00:12:42.291 "w_mbytes_per_sec": 0 00:12:42.291 }, 00:12:42.291 "claimed": false, 00:12:42.291 "zoned": false, 00:12:42.291 "supported_io_types": { 00:12:42.291 "read": true, 00:12:42.291 "write": true, 00:12:42.291 "unmap": false, 00:12:42.291 "flush": false, 00:12:42.291 "reset": true, 00:12:42.291 "nvme_admin": false, 00:12:42.291 "nvme_io": false, 00:12:42.291 "nvme_io_md": false, 00:12:42.291 "write_zeroes": true, 00:12:42.291 "zcopy": false, 00:12:42.291 "get_zone_info": false, 00:12:42.291 "zone_management": false, 00:12:42.291 "zone_append": false, 00:12:42.291 "compare": false, 00:12:42.291 "compare_and_write": false, 00:12:42.291 "abort": false, 00:12:42.291 "seek_hole": false, 00:12:42.291 "seek_data": false, 00:12:42.292 "copy": false, 00:12:42.292 "nvme_iov_md": false 00:12:42.292 }, 00:12:42.292 "memory_domains": [ 00:12:42.292 { 00:12:42.292 "dma_device_id": "system", 00:12:42.292 "dma_device_type": 1 00:12:42.292 }, 00:12:42.292 { 00:12:42.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.292 "dma_device_type": 2 00:12:42.292 }, 00:12:42.292 { 00:12:42.292 "dma_device_id": "system", 00:12:42.292 "dma_device_type": 1 00:12:42.292 }, 00:12:42.292 { 00:12:42.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.292 "dma_device_type": 2 00:12:42.292 } 00:12:42.292 ], 00:12:42.292 "driver_specific": { 00:12:42.292 "raid": { 00:12:42.292 "uuid": "e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:42.292 "strip_size_kb": 0, 00:12:42.292 "state": "online", 00:12:42.292 "raid_level": "raid1", 00:12:42.292 "superblock": true, 00:12:42.292 "num_base_bdevs": 2, 00:12:42.292 "num_base_bdevs_discovered": 2, 00:12:42.292 "num_base_bdevs_operational": 2, 00:12:42.292 "base_bdevs_list": [ 00:12:42.292 { 00:12:42.292 "name": "pt1", 00:12:42.292 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:42.292 "is_configured": true, 00:12:42.292 "data_offset": 2048, 00:12:42.292 "data_size": 63488 00:12:42.292 }, 00:12:42.292 { 00:12:42.292 "name": "pt2", 00:12:42.292 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:42.292 "is_configured": true, 00:12:42.292 "data_offset": 2048, 00:12:42.292 "data_size": 63488 00:12:42.292 } 00:12:42.292 ] 00:12:42.292 } 00:12:42.292 } 00:12:42.292 }' 00:12:42.292 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:42.292 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:42.292 pt2' 00:12:42.292 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:42.292 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:42.292 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:42.549 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:42.549 "name": "pt1", 00:12:42.549 "aliases": [ 00:12:42.549 "00000000-0000-0000-0000-000000000001" 00:12:42.549 ], 00:12:42.549 "product_name": "passthru", 00:12:42.549 "block_size": 512, 00:12:42.549 "num_blocks": 65536, 00:12:42.549 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:42.549 "assigned_rate_limits": { 00:12:42.549 
"rw_ios_per_sec": 0, 00:12:42.549 "rw_mbytes_per_sec": 0, 00:12:42.549 "r_mbytes_per_sec": 0, 00:12:42.549 "w_mbytes_per_sec": 0 00:12:42.549 }, 00:12:42.549 "claimed": true, 00:12:42.549 "claim_type": "exclusive_write", 00:12:42.549 "zoned": false, 00:12:42.549 "supported_io_types": { 00:12:42.549 "read": true, 00:12:42.549 "write": true, 00:12:42.549 "unmap": true, 00:12:42.549 "flush": true, 00:12:42.549 "reset": true, 00:12:42.549 "nvme_admin": false, 00:12:42.549 "nvme_io": false, 00:12:42.549 "nvme_io_md": false, 00:12:42.549 "write_zeroes": true, 00:12:42.549 "zcopy": true, 00:12:42.549 "get_zone_info": false, 00:12:42.549 "zone_management": false, 00:12:42.549 "zone_append": false, 00:12:42.549 "compare": false, 00:12:42.549 "compare_and_write": false, 00:12:42.549 "abort": true, 00:12:42.549 "seek_hole": false, 00:12:42.549 "seek_data": false, 00:12:42.549 "copy": true, 00:12:42.549 "nvme_iov_md": false 00:12:42.549 }, 00:12:42.549 "memory_domains": [ 00:12:42.549 { 00:12:42.549 "dma_device_id": "system", 00:12:42.549 "dma_device_type": 1 00:12:42.549 }, 00:12:42.549 { 00:12:42.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.549 "dma_device_type": 2 00:12:42.549 } 00:12:42.549 ], 00:12:42.549 "driver_specific": { 00:12:42.549 "passthru": { 00:12:42.549 "name": "pt1", 00:12:42.549 "base_bdev_name": "malloc1" 00:12:42.549 } 00:12:42.549 } 00:12:42.549 }' 00:12:42.549 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.549 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.549 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:42.549 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.549 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:42.807 09:16:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:43.370 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:43.370 "name": "pt2", 00:12:43.370 "aliases": [ 00:12:43.370 "00000000-0000-0000-0000-000000000002" 00:12:43.370 ], 00:12:43.370 "product_name": "passthru", 00:12:43.370 "block_size": 512, 00:12:43.370 "num_blocks": 65536, 00:12:43.370 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:43.370 "assigned_rate_limits": { 00:12:43.370 "rw_ios_per_sec": 0, 00:12:43.370 "rw_mbytes_per_sec": 0, 00:12:43.370 "r_mbytes_per_sec": 0, 00:12:43.370 "w_mbytes_per_sec": 0 
00:12:43.370 }, 00:12:43.370 "claimed": true, 00:12:43.370 "claim_type": "exclusive_write", 00:12:43.370 "zoned": false, 00:12:43.370 "supported_io_types": { 00:12:43.370 "read": true, 00:12:43.370 "write": true, 00:12:43.370 "unmap": true, 00:12:43.370 "flush": true, 00:12:43.370 "reset": true, 00:12:43.370 "nvme_admin": false, 00:12:43.370 "nvme_io": false, 00:12:43.370 "nvme_io_md": false, 00:12:43.370 "write_zeroes": true, 00:12:43.370 "zcopy": true, 00:12:43.370 "get_zone_info": false, 00:12:43.370 "zone_management": false, 00:12:43.370 "zone_append": false, 00:12:43.370 "compare": false, 00:12:43.370 "compare_and_write": false, 00:12:43.370 "abort": true, 00:12:43.370 "seek_hole": false, 00:12:43.370 "seek_data": false, 00:12:43.370 "copy": true, 00:12:43.370 "nvme_iov_md": false 00:12:43.370 }, 00:12:43.370 "memory_domains": [ 00:12:43.370 { 00:12:43.370 "dma_device_id": "system", 00:12:43.370 "dma_device_type": 1 00:12:43.370 }, 00:12:43.370 { 00:12:43.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.370 "dma_device_type": 2 00:12:43.370 } 00:12:43.370 ], 00:12:43.370 "driver_specific": { 00:12:43.370 "passthru": { 00:12:43.370 "name": "pt2", 00:12:43.370 "base_bdev_name": "malloc2" 00:12:43.370 } 00:12:43.370 } 00:12:43.370 }' 00:12:43.370 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.370 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.370 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:43.370 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:43.627 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:43.884 [2024-07-15 09:16:52.806795] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:43.884 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=e592ab81-3934-4660-a928-e3bf88e5aca7 00:12:43.884 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z e592ab81-3934-4660-a928-e3bf88e5aca7 ']' 00:12:43.884 09:16:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:44.449 [2024-07-15 09:16:53.311872] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:44.449 [2024-07-15 09:16:53.311902] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:12:44.449 [2024-07-15 09:16:53.311974] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:44.449 [2024-07-15 09:16:53.312035] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:44.449 [2024-07-15 09:16:53.312048] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14cd270 name raid_bdev1, state offline 00:12:44.449 09:16:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.449 09:16:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:44.706 09:16:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:44.706 09:16:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:44.706 09:16:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:44.706 09:16:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:44.963 09:16:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:44.963 09:16:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:45.220 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:45.220 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:45.478 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:45.735 [2024-07-15 09:16:54.474882] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:45.735 [2024-07-15 09:16:54.476290] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:45.735 [2024-07-15 09:16:54.476349] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:45.735 [2024-07-15 09:16:54.476392] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:45.735 [2024-07-15 09:16:54.476411] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:45.735 [2024-07-15 09:16:54.476421] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14ccff0 name raid_bdev1, state configuring 00:12:45.735 request: 00:12:45.735 { 00:12:45.735 "name": "raid_bdev1", 00:12:45.735 "raid_level": "raid1", 00:12:45.735 "base_bdevs": [ 00:12:45.735 "malloc1", 00:12:45.735 "malloc2" 00:12:45.735 ], 00:12:45.735 "superblock": false, 00:12:45.736 "method": "bdev_raid_create", 00:12:45.736 "req_id": 1 00:12:45.736 } 00:12:45.736 Got JSON-RPC error response 00:12:45.736 response: 00:12:45.736 { 00:12:45.736 "code": -17, 00:12:45.736 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:45.736 } 00:12:45.736 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:45.736 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:45.736 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:45.736 09:16:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:45.736 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.736 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:45.994 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:45.994 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:45.994 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:46.253 [2024-07-15 09:16:54.956106] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:46.253 [2024-07-15 09:16:54.956161] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:46.253 [2024-07-15 09:16:54.956184] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13297a0 00:12:46.253 [2024-07-15 09:16:54.956197] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:46.253 [2024-07-15 09:16:54.957875] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:12:46.253 [2024-07-15 09:16:54.957905] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:46.253 [2024-07-15 09:16:54.957983] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:46.253 [2024-07-15 09:16:54.958016] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:46.253 pt1 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.253 09:16:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:46.512 09:16:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.512 "name": "raid_bdev1", 00:12:46.512 "uuid": "e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:46.512 "strip_size_kb": 0, 00:12:46.512 "state": "configuring", 00:12:46.512 "raid_level": "raid1", 00:12:46.512 "superblock": true, 00:12:46.512 "num_base_bdevs": 2, 00:12:46.512 "num_base_bdevs_discovered": 1, 00:12:46.512 "num_base_bdevs_operational": 2, 00:12:46.512 "base_bdevs_list": [ 00:12:46.512 { 00:12:46.512 "name": "pt1", 00:12:46.512 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:46.512 "is_configured": true, 00:12:46.512 "data_offset": 2048, 00:12:46.512 "data_size": 63488 00:12:46.512 }, 00:12:46.512 { 00:12:46.512 "name": null, 00:12:46.512 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:46.512 "is_configured": false, 00:12:46.512 "data_offset": 2048, 00:12:46.512 "data_size": 63488 00:12:46.512 } 00:12:46.512 ] 00:12:46.512 }' 00:12:46.512 09:16:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.512 09:16:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.078 09:16:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:47.078 09:16:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:47.078 09:16:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:47.078 09:16:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:12:47.078 [2024-07-15 09:16:56.014923] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:47.078 [2024-07-15 09:16:56.014985] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:47.078 [2024-07-15 09:16:56.015005] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c16f0 00:12:47.078 [2024-07-15 09:16:56.015017] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:47.078 [2024-07-15 09:16:56.015379] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:47.078 [2024-07-15 09:16:56.015396] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:47.078 [2024-07-15 09:16:56.015459] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:47.078 [2024-07-15 09:16:56.015481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:47.078 [2024-07-15 09:16:56.015584] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14c2590 00:12:47.078 [2024-07-15 09:16:56.015595] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:47.078 [2024-07-15 09:16:56.015764] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1323540 00:12:47.078 [2024-07-15 09:16:56.015888] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14c2590 00:12:47.078 [2024-07-15 09:16:56.015898] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14c2590 00:12:47.078 [2024-07-15 09:16:56.016005] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:47.078 pt2 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:47.336 "name": "raid_bdev1", 00:12:47.336 
"uuid": "e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:47.336 "strip_size_kb": 0, 00:12:47.336 "state": "online", 00:12:47.336 "raid_level": "raid1", 00:12:47.336 "superblock": true, 00:12:47.336 "num_base_bdevs": 2, 00:12:47.336 "num_base_bdevs_discovered": 2, 00:12:47.336 "num_base_bdevs_operational": 2, 00:12:47.336 "base_bdevs_list": [ 00:12:47.336 { 00:12:47.336 "name": "pt1", 00:12:47.336 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:47.336 "is_configured": true, 00:12:47.336 "data_offset": 2048, 00:12:47.336 "data_size": 63488 00:12:47.336 }, 00:12:47.336 { 00:12:47.336 "name": "pt2", 00:12:47.336 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:47.336 "is_configured": true, 00:12:47.336 "data_offset": 2048, 00:12:47.336 "data_size": 63488 00:12:47.336 } 00:12:47.336 ] 00:12:47.336 }' 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.336 09:16:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.903 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:47.903 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:47.903 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:47.903 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:47.903 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:47.903 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:47.903 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:47.903 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:48.162 [2024-07-15 09:16:56.981750] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:48.162 09:16:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:48.162 "name": "raid_bdev1", 00:12:48.162 "aliases": [ 00:12:48.162 "e592ab81-3934-4660-a928-e3bf88e5aca7" 00:12:48.162 ], 00:12:48.162 "product_name": "Raid Volume", 00:12:48.162 "block_size": 512, 00:12:48.162 "num_blocks": 63488, 00:12:48.162 "uuid": "e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:48.162 "assigned_rate_limits": { 00:12:48.162 "rw_ios_per_sec": 0, 00:12:48.162 "rw_mbytes_per_sec": 0, 00:12:48.162 "r_mbytes_per_sec": 0, 00:12:48.162 "w_mbytes_per_sec": 0 00:12:48.162 }, 00:12:48.162 "claimed": false, 00:12:48.162 "zoned": false, 00:12:48.162 "supported_io_types": { 00:12:48.162 "read": true, 00:12:48.162 "write": true, 00:12:48.162 "unmap": false, 00:12:48.162 "flush": false, 00:12:48.162 "reset": true, 00:12:48.162 "nvme_admin": false, 00:12:48.162 "nvme_io": false, 00:12:48.162 "nvme_io_md": false, 00:12:48.162 "write_zeroes": true, 00:12:48.162 "zcopy": false, 00:12:48.162 "get_zone_info": false, 00:12:48.162 "zone_management": false, 00:12:48.162 "zone_append": false, 00:12:48.162 "compare": false, 00:12:48.162 "compare_and_write": false, 00:12:48.162 "abort": false, 00:12:48.162 "seek_hole": false, 00:12:48.162 "seek_data": false, 00:12:48.162 "copy": false, 00:12:48.162 "nvme_iov_md": false 00:12:48.162 }, 00:12:48.162 "memory_domains": [ 00:12:48.162 { 00:12:48.162 "dma_device_id": "system", 00:12:48.162 "dma_device_type": 1 
00:12:48.162 }, 00:12:48.162 { 00:12:48.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.162 "dma_device_type": 2 00:12:48.162 }, 00:12:48.162 { 00:12:48.162 "dma_device_id": "system", 00:12:48.162 "dma_device_type": 1 00:12:48.162 }, 00:12:48.162 { 00:12:48.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.162 "dma_device_type": 2 00:12:48.162 } 00:12:48.162 ], 00:12:48.162 "driver_specific": { 00:12:48.162 "raid": { 00:12:48.162 "uuid": "e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:48.162 "strip_size_kb": 0, 00:12:48.162 "state": "online", 00:12:48.162 "raid_level": "raid1", 00:12:48.162 "superblock": true, 00:12:48.162 "num_base_bdevs": 2, 00:12:48.162 "num_base_bdevs_discovered": 2, 00:12:48.162 "num_base_bdevs_operational": 2, 00:12:48.162 "base_bdevs_list": [ 00:12:48.162 { 00:12:48.162 "name": "pt1", 00:12:48.162 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:48.162 "is_configured": true, 00:12:48.162 "data_offset": 2048, 00:12:48.162 "data_size": 63488 00:12:48.162 }, 00:12:48.162 { 00:12:48.162 "name": "pt2", 00:12:48.162 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:48.162 "is_configured": true, 00:12:48.162 "data_offset": 2048, 00:12:48.162 "data_size": 63488 00:12:48.162 } 00:12:48.162 ] 00:12:48.162 } 00:12:48.162 } 00:12:48.162 }' 00:12:48.162 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:48.162 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:48.162 pt2' 00:12:48.162 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:48.162 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:48.162 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:48.420 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:48.420 "name": "pt1", 00:12:48.420 "aliases": [ 00:12:48.420 "00000000-0000-0000-0000-000000000001" 00:12:48.420 ], 00:12:48.420 "product_name": "passthru", 00:12:48.420 "block_size": 512, 00:12:48.420 "num_blocks": 65536, 00:12:48.420 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:48.420 "assigned_rate_limits": { 00:12:48.420 "rw_ios_per_sec": 0, 00:12:48.420 "rw_mbytes_per_sec": 0, 00:12:48.420 "r_mbytes_per_sec": 0, 00:12:48.420 "w_mbytes_per_sec": 0 00:12:48.420 }, 00:12:48.420 "claimed": true, 00:12:48.420 "claim_type": "exclusive_write", 00:12:48.420 "zoned": false, 00:12:48.420 "supported_io_types": { 00:12:48.420 "read": true, 00:12:48.420 "write": true, 00:12:48.420 "unmap": true, 00:12:48.420 "flush": true, 00:12:48.420 "reset": true, 00:12:48.420 "nvme_admin": false, 00:12:48.420 "nvme_io": false, 00:12:48.420 "nvme_io_md": false, 00:12:48.420 "write_zeroes": true, 00:12:48.420 "zcopy": true, 00:12:48.420 "get_zone_info": false, 00:12:48.420 "zone_management": false, 00:12:48.420 "zone_append": false, 00:12:48.420 "compare": false, 00:12:48.420 "compare_and_write": false, 00:12:48.420 "abort": true, 00:12:48.420 "seek_hole": false, 00:12:48.420 "seek_data": false, 00:12:48.420 "copy": true, 00:12:48.420 "nvme_iov_md": false 00:12:48.420 }, 00:12:48.420 "memory_domains": [ 00:12:48.420 { 00:12:48.420 "dma_device_id": "system", 00:12:48.420 "dma_device_type": 1 00:12:48.420 }, 00:12:48.420 { 00:12:48.420 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:48.420 "dma_device_type": 2 00:12:48.420 } 00:12:48.420 ], 00:12:48.420 "driver_specific": { 00:12:48.420 "passthru": { 00:12:48.420 "name": "pt1", 00:12:48.420 "base_bdev_name": "malloc1" 00:12:48.420 } 00:12:48.420 } 00:12:48.420 }' 00:12:48.420 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.420 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.420 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:48.420 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.420 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.420 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:48.420 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.679 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.679 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:48.679 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.679 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.679 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:48.679 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:48.679 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:48.679 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:48.937 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:48.937 "name": "pt2", 00:12:48.937 "aliases": [ 00:12:48.937 "00000000-0000-0000-0000-000000000002" 00:12:48.937 ], 00:12:48.937 "product_name": "passthru", 00:12:48.937 "block_size": 512, 00:12:48.937 "num_blocks": 65536, 00:12:48.937 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:48.937 "assigned_rate_limits": { 00:12:48.937 "rw_ios_per_sec": 0, 00:12:48.937 "rw_mbytes_per_sec": 0, 00:12:48.937 "r_mbytes_per_sec": 0, 00:12:48.937 "w_mbytes_per_sec": 0 00:12:48.937 }, 00:12:48.937 "claimed": true, 00:12:48.937 "claim_type": "exclusive_write", 00:12:48.937 "zoned": false, 00:12:48.937 "supported_io_types": { 00:12:48.937 "read": true, 00:12:48.937 "write": true, 00:12:48.937 "unmap": true, 00:12:48.937 "flush": true, 00:12:48.937 "reset": true, 00:12:48.937 "nvme_admin": false, 00:12:48.937 "nvme_io": false, 00:12:48.937 "nvme_io_md": false, 00:12:48.937 "write_zeroes": true, 00:12:48.937 "zcopy": true, 00:12:48.937 "get_zone_info": false, 00:12:48.937 "zone_management": false, 00:12:48.937 "zone_append": false, 00:12:48.937 "compare": false, 00:12:48.937 "compare_and_write": false, 00:12:48.937 "abort": true, 00:12:48.937 "seek_hole": false, 00:12:48.937 "seek_data": false, 00:12:48.937 "copy": true, 00:12:48.937 "nvme_iov_md": false 00:12:48.937 }, 00:12:48.937 "memory_domains": [ 00:12:48.937 { 00:12:48.937 "dma_device_id": "system", 00:12:48.937 "dma_device_type": 1 00:12:48.937 }, 00:12:48.937 { 00:12:48.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.937 "dma_device_type": 2 00:12:48.937 } 00:12:48.937 ], 00:12:48.937 "driver_specific": { 
00:12:48.937 "passthru": { 00:12:48.937 "name": "pt2", 00:12:48.937 "base_bdev_name": "malloc2" 00:12:48.937 } 00:12:48.938 } 00:12:48.938 }' 00:12:48.938 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.938 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.938 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:48.938 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.938 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.938 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:48.938 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.196 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.196 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:49.196 09:16:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.196 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.196 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:49.196 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:49.196 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:49.455 [2024-07-15 09:16:58.204977] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:49.455 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' e592ab81-3934-4660-a928-e3bf88e5aca7 '!=' e592ab81-3934-4660-a928-e3bf88e5aca7 ']' 00:12:49.455 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:12:49.455 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:49.455 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:49.455 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:49.714 [2024-07-15 09:16:58.445383] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.714 09:16:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.714 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:49.972 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.972 "name": "raid_bdev1", 00:12:49.972 "uuid": "e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:49.972 "strip_size_kb": 0, 00:12:49.972 "state": "online", 00:12:49.972 "raid_level": "raid1", 00:12:49.972 "superblock": true, 00:12:49.972 "num_base_bdevs": 2, 00:12:49.972 "num_base_bdevs_discovered": 1, 00:12:49.972 "num_base_bdevs_operational": 1, 00:12:49.972 "base_bdevs_list": [ 00:12:49.972 { 00:12:49.972 "name": null, 00:12:49.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.972 "is_configured": false, 00:12:49.972 "data_offset": 2048, 00:12:49.972 "data_size": 63488 00:12:49.972 }, 00:12:49.972 { 00:12:49.972 "name": "pt2", 00:12:49.972 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:49.972 "is_configured": true, 00:12:49.972 "data_offset": 2048, 00:12:49.972 "data_size": 63488 00:12:49.972 } 00:12:49.972 ] 00:12:49.972 }' 00:12:49.972 09:16:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.972 09:16:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:50.906 09:16:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:50.906 [2024-07-15 09:16:59.805015] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:50.906 [2024-07-15 09:16:59.805052] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:50.906 [2024-07-15 09:16:59.805113] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:50.906 [2024-07-15 09:16:59.805158] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:50.906 [2024-07-15 09:16:59.805169] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c2590 name raid_bdev1, state offline 00:12:50.906 09:16:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.906 09:16:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:12:51.164 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:12:51.164 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:12:51.164 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:12:51.164 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:51.164 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:51.422 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:12:51.422 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:51.422 09:17:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:12:51.422 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:12:51.422 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:12:51.422 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:51.682 [2024-07-15 09:17:00.486771] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:51.682 [2024-07-15 09:17:00.486822] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:51.682 [2024-07-15 09:17:00.486840] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132a160 00:12:51.682 [2024-07-15 09:17:00.486852] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.682 [2024-07-15 09:17:00.488464] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.682 [2024-07-15 09:17:00.488492] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:51.682 [2024-07-15 09:17:00.488557] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:51.682 [2024-07-15 09:17:00.488585] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:51.682 [2024-07-15 09:17:00.488671] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1320380 00:12:51.682 [2024-07-15 09:17:00.488682] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:51.682 [2024-07-15 09:17:00.488848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1321a80 00:12:51.682 [2024-07-15 09:17:00.488977] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1320380 00:12:51.682 [2024-07-15 09:17:00.488988] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1320380 00:12:51.682 [2024-07-15 09:17:00.489083] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:51.682 pt2 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:51.682 09:17:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.998 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.998 "name": "raid_bdev1", 00:12:51.998 "uuid": "e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:51.998 "strip_size_kb": 0, 00:12:51.998 "state": "online", 00:12:51.998 "raid_level": "raid1", 00:12:51.998 "superblock": true, 00:12:51.998 "num_base_bdevs": 2, 00:12:51.998 "num_base_bdevs_discovered": 1, 00:12:51.998 "num_base_bdevs_operational": 1, 00:12:51.998 "base_bdevs_list": [ 00:12:51.998 { 00:12:51.998 "name": null, 00:12:51.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.998 "is_configured": false, 00:12:51.998 "data_offset": 2048, 00:12:51.998 "data_size": 63488 00:12:51.998 }, 00:12:51.998 { 00:12:51.998 "name": "pt2", 00:12:51.998 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:51.998 "is_configured": true, 00:12:51.998 "data_offset": 2048, 00:12:51.998 "data_size": 63488 00:12:51.998 } 00:12:51.998 ] 00:12:51.998 }' 00:12:51.998 09:17:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.998 09:17:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.564 09:17:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:52.822 [2024-07-15 09:17:01.633807] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:52.822 [2024-07-15 09:17:01.633836] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:52.822 [2024-07-15 09:17:01.633893] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:52.822 [2024-07-15 09:17:01.633942] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:52.822 [2024-07-15 09:17:01.633955] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1320380 name raid_bdev1, state offline 00:12:52.822 09:17:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.822 09:17:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:12:53.080 09:17:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:12:53.080 09:17:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:12:53.080 09:17:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:12:53.080 09:17:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:53.338 [2024-07-15 09:17:02.115071] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:53.338 [2024-07-15 09:17:02.115121] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:53.338 [2024-07-15 09:17:02.115140] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14cc520 00:12:53.338 [2024-07-15 09:17:02.115152] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:53.338 [2024-07-15 09:17:02.116744] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:53.338 [2024-07-15 09:17:02.116772] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:53.338 [2024-07-15 09:17:02.116835] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:53.338 [2024-07-15 09:17:02.116862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:53.338 [2024-07-15 09:17:02.116968] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:12:53.338 [2024-07-15 09:17:02.116988] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:53.338 [2024-07-15 09:17:02.117002] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13213f0 name raid_bdev1, state configuring 00:12:53.338 [2024-07-15 09:17:02.117025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:53.338 [2024-07-15 09:17:02.117082] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13232b0 00:12:53.338 [2024-07-15 09:17:02.117092] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:53.338 [2024-07-15 09:17:02.117251] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1320350 00:12:53.338 [2024-07-15 09:17:02.117370] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13232b0 00:12:53.338 [2024-07-15 09:17:02.117380] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13232b0 00:12:53.338 [2024-07-15 09:17:02.117474] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:53.338 pt1 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.338 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:53.596 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.596 "name": "raid_bdev1", 00:12:53.596 "uuid": "e592ab81-3934-4660-a928-e3bf88e5aca7", 00:12:53.596 "strip_size_kb": 0, 00:12:53.596 "state": "online", 
00:12:53.596 "raid_level": "raid1", 00:12:53.596 "superblock": true, 00:12:53.596 "num_base_bdevs": 2, 00:12:53.596 "num_base_bdevs_discovered": 1, 00:12:53.596 "num_base_bdevs_operational": 1, 00:12:53.596 "base_bdevs_list": [ 00:12:53.596 { 00:12:53.596 "name": null, 00:12:53.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.596 "is_configured": false, 00:12:53.596 "data_offset": 2048, 00:12:53.596 "data_size": 63488 00:12:53.596 }, 00:12:53.596 { 00:12:53.596 "name": "pt2", 00:12:53.596 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:53.596 "is_configured": true, 00:12:53.596 "data_offset": 2048, 00:12:53.596 "data_size": 63488 00:12:53.596 } 00:12:53.596 ] 00:12:53.596 }' 00:12:53.596 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.596 09:17:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.162 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:54.162 09:17:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:12:54.433 [2024-07-15 09:17:03.354589] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' e592ab81-3934-4660-a928-e3bf88e5aca7 '!=' e592ab81-3934-4660-a928-e3bf88e5aca7 ']' 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 100165 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 100165 ']' 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 100165 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:54.433 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 100165 00:12:54.695 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:54.695 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:54.695 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 100165' 00:12:54.695 killing process with pid 100165 00:12:54.695 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 100165 00:12:54.695 [2024-07-15 09:17:03.422462] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:54.695 [2024-07-15 09:17:03.422531] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:54.695 [2024-07-15 09:17:03.422578] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:54.695 [2024-07-15 09:17:03.422591] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x13232b0 name raid_bdev1, state offline 00:12:54.695 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 100165 00:12:54.695 [2024-07-15 09:17:03.441876] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:54.955 09:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:54.955 00:12:54.955 real 0m15.889s 00:12:54.955 user 0m28.875s 00:12:54.955 sys 0m2.840s 00:12:54.955 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:54.955 09:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.955 ************************************ 00:12:54.955 END TEST raid_superblock_test 00:12:54.955 ************************************ 00:12:54.955 09:17:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:54.955 09:17:03 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:12:54.955 09:17:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:54.955 09:17:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:54.955 09:17:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:54.955 ************************************ 00:12:54.955 START TEST raid_read_error_test 00:12:54.955 ************************************ 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:54.955 09:17:03 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.kIgFoF2UiZ 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=102600 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 102600 /var/tmp/spdk-raid.sock 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 102600 ']' 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:54.956 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:54.956 09:17:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.956 [2024-07-15 09:17:03.826598] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:12:54.956 [2024-07-15 09:17:03.826664] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid102600 ] 00:12:55.214 [2024-07-15 09:17:03.954850] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:55.214 [2024-07-15 09:17:04.059054] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:55.214 [2024-07-15 09:17:04.113410] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.214 [2024-07-15 09:17:04.113445] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:56.150 09:17:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:56.150 09:17:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:56.150 09:17:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:56.150 09:17:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:56.409 BaseBdev1_malloc 00:12:56.409 09:17:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:56.667 true 00:12:56.667 09:17:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:56.925 [2024-07-15 09:17:05.709497] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:56.925 [2024-07-15 09:17:05.709539] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:56.925 [2024-07-15 09:17:05.709564] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18740d0 00:12:56.925 [2024-07-15 09:17:05.709582] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:56.925 [2024-07-15 09:17:05.711427] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:56.925 [2024-07-15 09:17:05.711456] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:56.925 BaseBdev1 00:12:56.925 09:17:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:56.925 09:17:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:57.183 BaseBdev2_malloc 00:12:57.183 09:17:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:57.183 true 00:12:57.183 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:57.441 [2024-07-15 09:17:06.272827] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:57.441 [2024-07-15 09:17:06.272870] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:57.441 [2024-07-15 09:17:06.272896] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1878910 00:12:57.441 [2024-07-15 09:17:06.272908] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:57.441 [2024-07-15 09:17:06.274525] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:57.441 [2024-07-15 09:17:06.274553] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:57.441 BaseBdev2 00:12:57.441 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:57.700 [2024-07-15 09:17:06.437285] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:57.700 [2024-07-15 09:17:06.438663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:57.700 [2024-07-15 09:17:06.438856] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x187a320 00:12:57.700 [2024-07-15 09:17:06.438869] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:57.700 [2024-07-15 09:17:06.439077] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e1d00 00:12:57.700 [2024-07-15 09:17:06.439228] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x187a320 00:12:57.700 [2024-07-15 09:17:06.439238] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x187a320 00:12:57.700 [2024-07-15 09:17:06.439345] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:57.700 09:17:06 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.700 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:57.958 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.958 "name": "raid_bdev1", 00:12:57.958 "uuid": "d6a03300-9475-48c3-9dd1-14da73b8d120", 00:12:57.958 "strip_size_kb": 0, 00:12:57.958 "state": "online", 00:12:57.958 "raid_level": "raid1", 00:12:57.958 "superblock": true, 00:12:57.958 "num_base_bdevs": 2, 00:12:57.958 "num_base_bdevs_discovered": 2, 00:12:57.958 "num_base_bdevs_operational": 2, 00:12:57.958 "base_bdevs_list": [ 00:12:57.958 { 00:12:57.958 "name": "BaseBdev1", 00:12:57.958 "uuid": "0dfbb479-33c2-5539-9e99-66e1f5236a8b", 00:12:57.958 "is_configured": true, 00:12:57.958 "data_offset": 2048, 00:12:57.958 "data_size": 63488 00:12:57.958 }, 00:12:57.958 { 00:12:57.958 "name": "BaseBdev2", 00:12:57.958 "uuid": "bfdec03a-feee-58bb-a850-fa0c7cb1ff6d", 00:12:57.958 "is_configured": true, 00:12:57.958 "data_offset": 2048, 00:12:57.958 "data_size": 63488 00:12:57.958 } 00:12:57.958 ] 00:12:57.958 }' 00:12:57.958 09:17:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.958 09:17:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.525 09:17:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:58.525 09:17:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:58.525 [2024-07-15 09:17:07.384096] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1875c70 00:12:59.459 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 
00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:59.718 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:59.719 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.719 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.719 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.719 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.719 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.719 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:59.978 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.978 "name": "raid_bdev1", 00:12:59.978 "uuid": "d6a03300-9475-48c3-9dd1-14da73b8d120", 00:12:59.978 "strip_size_kb": 0, 00:12:59.978 "state": "online", 00:12:59.978 "raid_level": "raid1", 00:12:59.978 "superblock": true, 00:12:59.978 "num_base_bdevs": 2, 00:12:59.978 "num_base_bdevs_discovered": 2, 00:12:59.978 "num_base_bdevs_operational": 2, 00:12:59.978 "base_bdevs_list": [ 00:12:59.978 { 00:12:59.978 "name": "BaseBdev1", 00:12:59.978 "uuid": "0dfbb479-33c2-5539-9e99-66e1f5236a8b", 00:12:59.978 "is_configured": true, 00:12:59.978 "data_offset": 2048, 00:12:59.978 "data_size": 63488 00:12:59.978 }, 00:12:59.978 { 00:12:59.978 "name": "BaseBdev2", 00:12:59.978 "uuid": "bfdec03a-feee-58bb-a850-fa0c7cb1ff6d", 00:12:59.978 "is_configured": true, 00:12:59.978 "data_offset": 2048, 00:12:59.978 "data_size": 63488 00:12:59.978 } 00:12:59.978 ] 00:12:59.978 }' 00:12:59.978 09:17:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.978 09:17:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.545 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:00.802 [2024-07-15 09:17:09.607567] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:00.802 [2024-07-15 09:17:09.607603] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:00.802 [2024-07-15 09:17:09.610781] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:00.802 [2024-07-15 09:17:09.610812] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:00.802 [2024-07-15 09:17:09.610890] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:00.802 
[2024-07-15 09:17:09.610902] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x187a320 name raid_bdev1, state offline 00:13:00.802 0 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 102600 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 102600 ']' 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 102600 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 102600 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 102600' 00:13:00.802 killing process with pid 102600 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 102600 00:13:00.802 [2024-07-15 09:17:09.671104] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:00.802 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 102600 00:13:00.802 [2024-07-15 09:17:09.681477] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.kIgFoF2UiZ 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:01.060 00:13:01.060 real 0m6.154s 00:13:01.060 user 0m9.615s 00:13:01.060 sys 0m1.086s 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:01.060 09:17:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.060 ************************************ 00:13:01.060 END TEST raid_read_error_test 00:13:01.060 ************************************ 00:13:01.060 09:17:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:01.060 09:17:09 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:13:01.060 09:17:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:01.060 09:17:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:01.060 09:17:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:01.060 ************************************ 00:13:01.060 START TEST raid_write_error_test 00:13:01.060 ************************************ 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:01.060 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:01.061 09:17:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.qKHwfVO8gc 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=103416 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 103416 /var/tmp/spdk-raid.sock 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 103416 ']' 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:01.061 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:01.061 09:17:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.319 [2024-07-15 09:17:10.069117] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:13:01.320 [2024-07-15 09:17:10.069190] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid103416 ] 00:13:01.320 [2024-07-15 09:17:10.201204] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:01.579 [2024-07-15 09:17:10.304798] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:01.579 [2024-07-15 09:17:10.366407] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:01.579 [2024-07-15 09:17:10.366439] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:02.155 09:17:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:02.155 09:17:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:02.155 09:17:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:02.155 09:17:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:02.416 BaseBdev1_malloc 00:13:02.416 09:17:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:02.674 true 00:13:02.674 09:17:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:02.931 [2024-07-15 09:17:11.655823] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:02.931 [2024-07-15 09:17:11.655871] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:02.931 [2024-07-15 09:17:11.655891] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f50d0 00:13:02.931 [2024-07-15 09:17:11.655904] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:02.931 [2024-07-15 09:17:11.657746] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:02.931 [2024-07-15 09:17:11.657775] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:02.931 BaseBdev1 00:13:02.931 09:17:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:02.931 09:17:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:03.189 BaseBdev2_malloc 00:13:03.189 09:17:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:03.447 true 00:13:03.447 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:03.705 [2024-07-15 09:17:12.402464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:03.705 [2024-07-15 09:17:12.402509] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:03.705 [2024-07-15 09:17:12.402528] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f9910 00:13:03.705 [2024-07-15 09:17:12.402541] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:03.705 [2024-07-15 09:17:12.403972] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:03.705 [2024-07-15 09:17:12.403999] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:03.705 BaseBdev2 00:13:03.705 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:03.705 [2024-07-15 09:17:12.647133] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:03.705 [2024-07-15 09:17:12.648410] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:03.705 [2024-07-15 09:17:12.648602] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20fb320 00:13:03.705 [2024-07-15 09:17:12.648615] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:03.705 [2024-07-15 09:17:12.648807] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f62d00 00:13:03.705 [2024-07-15 09:17:12.648966] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20fb320 00:13:03.705 [2024-07-15 09:17:12.648977] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20fb320 00:13:03.705 [2024-07-15 09:17:12.649083] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.963 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "raid_bdev1")' 00:13:04.227 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:04.227 "name": "raid_bdev1", 00:13:04.227 "uuid": "0849971c-15fd-44d7-ab33-fdb772cfa186", 00:13:04.227 "strip_size_kb": 0, 00:13:04.227 "state": "online", 00:13:04.227 "raid_level": "raid1", 00:13:04.227 "superblock": true, 00:13:04.227 "num_base_bdevs": 2, 00:13:04.227 "num_base_bdevs_discovered": 2, 00:13:04.227 "num_base_bdevs_operational": 2, 00:13:04.227 "base_bdevs_list": [ 00:13:04.227 { 00:13:04.227 "name": "BaseBdev1", 00:13:04.227 "uuid": "56073968-aeb3-5f4c-aa7c-c816bf069f4a", 00:13:04.227 "is_configured": true, 00:13:04.227 "data_offset": 2048, 00:13:04.227 "data_size": 63488 00:13:04.227 }, 00:13:04.227 { 00:13:04.227 "name": "BaseBdev2", 00:13:04.227 "uuid": "5b6c5069-2b5a-56a1-9acf-e7177fd56ff4", 00:13:04.227 "is_configured": true, 00:13:04.227 "data_offset": 2048, 00:13:04.227 "data_size": 63488 00:13:04.227 } 00:13:04.227 ] 00:13:04.227 }' 00:13:04.227 09:17:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:04.227 09:17:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:04.790 09:17:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:04.790 09:17:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:04.790 [2024-07-15 09:17:13.602064] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f6c70 00:13:05.721 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:05.980 [2024-07-15 09:17:14.758715] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:05.980 [2024-07-15 09:17:14.758773] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:05.980 [2024-07-15 09:17:14.758962] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x20f6c70 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:05.980 09:17:14 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.980 09:17:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:06.239 09:17:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:06.239 "name": "raid_bdev1", 00:13:06.239 "uuid": "0849971c-15fd-44d7-ab33-fdb772cfa186", 00:13:06.239 "strip_size_kb": 0, 00:13:06.239 "state": "online", 00:13:06.239 "raid_level": "raid1", 00:13:06.239 "superblock": true, 00:13:06.239 "num_base_bdevs": 2, 00:13:06.239 "num_base_bdevs_discovered": 1, 00:13:06.239 "num_base_bdevs_operational": 1, 00:13:06.239 "base_bdevs_list": [ 00:13:06.239 { 00:13:06.239 "name": null, 00:13:06.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.239 "is_configured": false, 00:13:06.239 "data_offset": 2048, 00:13:06.239 "data_size": 63488 00:13:06.239 }, 00:13:06.239 { 00:13:06.239 "name": "BaseBdev2", 00:13:06.239 "uuid": "5b6c5069-2b5a-56a1-9acf-e7177fd56ff4", 00:13:06.239 "is_configured": true, 00:13:06.239 "data_offset": 2048, 00:13:06.239 "data_size": 63488 00:13:06.239 } 00:13:06.239 ] 00:13:06.239 }' 00:13:06.239 09:17:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:06.239 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.809 09:17:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:07.067 [2024-07-15 09:17:15.862904] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:07.067 [2024-07-15 09:17:15.862951] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:07.067 [2024-07-15 09:17:15.866076] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:07.067 [2024-07-15 09:17:15.866103] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:07.067 [2024-07-15 09:17:15.866154] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:07.067 [2024-07-15 09:17:15.866166] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20fb320 name raid_bdev1, state offline 00:13:07.067 0 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 103416 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 103416 ']' 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 103416 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 103416 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 103416' 00:13:07.067 killing process with pid 103416 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 103416 00:13:07.067 [2024-07-15 09:17:15.933711] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:07.067 09:17:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 103416 00:13:07.067 [2024-07-15 09:17:15.944965] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.qKHwfVO8gc 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:07.327 00:13:07.327 real 0m6.195s 00:13:07.327 user 0m9.594s 00:13:07.327 sys 0m1.141s 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:07.327 09:17:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.327 ************************************ 00:13:07.327 END TEST raid_write_error_test 00:13:07.327 ************************************ 00:13:07.327 09:17:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:07.327 09:17:16 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:13:07.327 09:17:16 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:07.327 09:17:16 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:07.327 09:17:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:07.327 09:17:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:07.327 09:17:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:07.327 ************************************ 00:13:07.327 START TEST raid_state_function_test 00:13:07.327 ************************************ 00:13:07.327 09:17:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:13:07.327 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:07.327 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:07.327 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:07.327 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:07.327 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:07.327 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.327 
09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=104384 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 104384' 00:13:07.586 Process raid pid: 104384 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 104384 /var/tmp/spdk-raid.sock 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 104384 ']' 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:07.586 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:07.586 09:17:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.586 [2024-07-15 09:17:16.344261] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:13:07.586 [2024-07-15 09:17:16.344332] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:07.586 [2024-07-15 09:17:16.476758] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:07.845 [2024-07-15 09:17:16.582181] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:07.845 [2024-07-15 09:17:16.651253] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:07.845 [2024-07-15 09:17:16.651288] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:08.485 09:17:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:08.485 09:17:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:08.485 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:08.744 [2024-07-15 09:17:17.439283] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:08.744 [2024-07-15 09:17:17.439328] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:08.744 [2024-07-15 09:17:17.439339] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:08.744 [2024-07-15 09:17:17.439351] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:08.744 [2024-07-15 09:17:17.439360] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:08.744 [2024-07-15 09:17:17.439371] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.744 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.004 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.004 "name": "Existed_Raid", 00:13:09.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.004 "strip_size_kb": 64, 00:13:09.004 "state": "configuring", 00:13:09.004 "raid_level": "raid0", 00:13:09.004 "superblock": false, 00:13:09.004 "num_base_bdevs": 3, 00:13:09.004 "num_base_bdevs_discovered": 0, 00:13:09.004 "num_base_bdevs_operational": 3, 00:13:09.004 "base_bdevs_list": [ 00:13:09.004 { 00:13:09.004 "name": "BaseBdev1", 00:13:09.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.004 "is_configured": false, 00:13:09.004 "data_offset": 0, 00:13:09.004 "data_size": 0 00:13:09.004 }, 00:13:09.004 { 00:13:09.004 "name": "BaseBdev2", 00:13:09.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.004 "is_configured": false, 00:13:09.004 "data_offset": 0, 00:13:09.004 "data_size": 0 00:13:09.004 }, 00:13:09.004 { 00:13:09.004 "name": "BaseBdev3", 00:13:09.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.004 "is_configured": false, 00:13:09.004 "data_offset": 0, 00:13:09.004 "data_size": 0 00:13:09.004 } 00:13:09.004 ] 00:13:09.004 }' 00:13:09.004 09:17:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.004 09:17:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:09.572 09:17:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:09.572 [2024-07-15 09:17:18.522009] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:09.572 [2024-07-15 09:17:18.522044] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1028a80 name Existed_Raid, state configuring 00:13:09.837 09:17:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:09.837 [2024-07-15 09:17:18.766680] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:09.837 [2024-07-15 09:17:18.766715] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:09.837 [2024-07-15 09:17:18.766724] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:09.837 [2024-07-15 09:17:18.766736] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:09.837 [2024-07-15 09:17:18.766745] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:09.837 [2024-07-15 09:17:18.766756] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:09.837 09:17:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:10.096 [2024-07-15 09:17:19.021081] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:10.096 BaseBdev1 00:13:10.096 
09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:10.096 09:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:10.096 09:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:10.096 09:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:10.096 09:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:10.096 09:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:10.096 09:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:10.355 09:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:10.614 [ 00:13:10.614 { 00:13:10.614 "name": "BaseBdev1", 00:13:10.614 "aliases": [ 00:13:10.614 "7877af77-edef-4248-9e30-0e275c0d98ab" 00:13:10.614 ], 00:13:10.614 "product_name": "Malloc disk", 00:13:10.614 "block_size": 512, 00:13:10.614 "num_blocks": 65536, 00:13:10.614 "uuid": "7877af77-edef-4248-9e30-0e275c0d98ab", 00:13:10.614 "assigned_rate_limits": { 00:13:10.614 "rw_ios_per_sec": 0, 00:13:10.614 "rw_mbytes_per_sec": 0, 00:13:10.614 "r_mbytes_per_sec": 0, 00:13:10.614 "w_mbytes_per_sec": 0 00:13:10.614 }, 00:13:10.614 "claimed": true, 00:13:10.614 "claim_type": "exclusive_write", 00:13:10.614 "zoned": false, 00:13:10.614 "supported_io_types": { 00:13:10.614 "read": true, 00:13:10.614 "write": true, 00:13:10.615 "unmap": true, 00:13:10.615 "flush": true, 00:13:10.615 "reset": true, 00:13:10.615 "nvme_admin": false, 00:13:10.615 "nvme_io": false, 00:13:10.615 "nvme_io_md": false, 00:13:10.615 "write_zeroes": true, 00:13:10.615 "zcopy": true, 00:13:10.615 "get_zone_info": false, 00:13:10.615 "zone_management": false, 00:13:10.615 "zone_append": false, 00:13:10.615 "compare": false, 00:13:10.615 "compare_and_write": false, 00:13:10.615 "abort": true, 00:13:10.615 "seek_hole": false, 00:13:10.615 "seek_data": false, 00:13:10.615 "copy": true, 00:13:10.615 "nvme_iov_md": false 00:13:10.615 }, 00:13:10.615 "memory_domains": [ 00:13:10.615 { 00:13:10.615 "dma_device_id": "system", 00:13:10.615 "dma_device_type": 1 00:13:10.615 }, 00:13:10.615 { 00:13:10.615 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.615 "dma_device_type": 2 00:13:10.615 } 00:13:10.615 ], 00:13:10.615 "driver_specific": {} 00:13:10.615 } 00:13:10.615 ] 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.615 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.874 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.874 "name": "Existed_Raid", 00:13:10.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.874 "strip_size_kb": 64, 00:13:10.874 "state": "configuring", 00:13:10.874 "raid_level": "raid0", 00:13:10.874 "superblock": false, 00:13:10.874 "num_base_bdevs": 3, 00:13:10.874 "num_base_bdevs_discovered": 1, 00:13:10.874 "num_base_bdevs_operational": 3, 00:13:10.874 "base_bdevs_list": [ 00:13:10.874 { 00:13:10.874 "name": "BaseBdev1", 00:13:10.874 "uuid": "7877af77-edef-4248-9e30-0e275c0d98ab", 00:13:10.874 "is_configured": true, 00:13:10.874 "data_offset": 0, 00:13:10.874 "data_size": 65536 00:13:10.874 }, 00:13:10.874 { 00:13:10.874 "name": "BaseBdev2", 00:13:10.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.874 "is_configured": false, 00:13:10.874 "data_offset": 0, 00:13:10.874 "data_size": 0 00:13:10.874 }, 00:13:10.874 { 00:13:10.874 "name": "BaseBdev3", 00:13:10.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.874 "is_configured": false, 00:13:10.874 "data_offset": 0, 00:13:10.874 "data_size": 0 00:13:10.874 } 00:13:10.874 ] 00:13:10.874 }' 00:13:10.874 09:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.874 09:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.442 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:11.700 [2024-07-15 09:17:20.617325] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:11.700 [2024-07-15 09:17:20.617377] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1028310 name Existed_Raid, state configuring 00:13:11.700 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:11.959 [2024-07-15 09:17:20.866002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.959 [2024-07-15 09:17:20.867536] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:11.959 [2024-07-15 09:17:20.867570] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:11.959 [2024-07-15 09:17:20.867580] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:11.959 [2024-07-15 09:17:20.867592] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev3 doesn't exist now 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.960 09:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.217 09:17:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.217 "name": "Existed_Raid", 00:13:12.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.217 "strip_size_kb": 64, 00:13:12.217 "state": "configuring", 00:13:12.217 "raid_level": "raid0", 00:13:12.217 "superblock": false, 00:13:12.217 "num_base_bdevs": 3, 00:13:12.217 "num_base_bdevs_discovered": 1, 00:13:12.217 "num_base_bdevs_operational": 3, 00:13:12.217 "base_bdevs_list": [ 00:13:12.217 { 00:13:12.217 "name": "BaseBdev1", 00:13:12.217 "uuid": "7877af77-edef-4248-9e30-0e275c0d98ab", 00:13:12.217 "is_configured": true, 00:13:12.217 "data_offset": 0, 00:13:12.217 "data_size": 65536 00:13:12.217 }, 00:13:12.217 { 00:13:12.217 "name": "BaseBdev2", 00:13:12.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.218 "is_configured": false, 00:13:12.218 "data_offset": 0, 00:13:12.218 "data_size": 0 00:13:12.218 }, 00:13:12.218 { 00:13:12.218 "name": "BaseBdev3", 00:13:12.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.218 "is_configured": false, 00:13:12.218 "data_offset": 0, 00:13:12.218 "data_size": 0 00:13:12.218 } 00:13:12.218 ] 00:13:12.218 }' 00:13:12.218 09:17:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.218 09:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.153 09:17:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:13.153 [2024-07-15 09:17:21.904237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:13.153 BaseBdev2 00:13:13.153 09:17:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:13.153 09:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:13.153 09:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:13.153 09:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:13.153 09:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:13.153 09:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:13.153 09:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:13.153 09:17:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:13.411 [ 00:13:13.411 { 00:13:13.411 "name": "BaseBdev2", 00:13:13.411 "aliases": [ 00:13:13.411 "1eb5da6e-f63b-4f17-ad4f-04003227ff92" 00:13:13.411 ], 00:13:13.411 "product_name": "Malloc disk", 00:13:13.411 "block_size": 512, 00:13:13.411 "num_blocks": 65536, 00:13:13.411 "uuid": "1eb5da6e-f63b-4f17-ad4f-04003227ff92", 00:13:13.411 "assigned_rate_limits": { 00:13:13.411 "rw_ios_per_sec": 0, 00:13:13.411 "rw_mbytes_per_sec": 0, 00:13:13.411 "r_mbytes_per_sec": 0, 00:13:13.411 "w_mbytes_per_sec": 0 00:13:13.411 }, 00:13:13.411 "claimed": true, 00:13:13.411 "claim_type": "exclusive_write", 00:13:13.411 "zoned": false, 00:13:13.411 "supported_io_types": { 00:13:13.411 "read": true, 00:13:13.411 "write": true, 00:13:13.411 "unmap": true, 00:13:13.411 "flush": true, 00:13:13.411 "reset": true, 00:13:13.411 "nvme_admin": false, 00:13:13.411 "nvme_io": false, 00:13:13.411 "nvme_io_md": false, 00:13:13.411 "write_zeroes": true, 00:13:13.411 "zcopy": true, 00:13:13.411 "get_zone_info": false, 00:13:13.411 "zone_management": false, 00:13:13.411 "zone_append": false, 00:13:13.411 "compare": false, 00:13:13.411 "compare_and_write": false, 00:13:13.411 "abort": true, 00:13:13.411 "seek_hole": false, 00:13:13.411 "seek_data": false, 00:13:13.411 "copy": true, 00:13:13.411 "nvme_iov_md": false 00:13:13.411 }, 00:13:13.411 "memory_domains": [ 00:13:13.411 { 00:13:13.411 "dma_device_id": "system", 00:13:13.411 "dma_device_type": 1 00:13:13.411 }, 00:13:13.411 { 00:13:13.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.411 "dma_device_type": 2 00:13:13.411 } 00:13:13.411 ], 00:13:13.411 "driver_specific": {} 00:13:13.411 } 00:13:13.411 ] 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
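Base bdevs are added one at a time with the same recipe seen here for BaseBdev2: create a 32 MiB malloc bdev with 512-byte blocks (65536 blocks, matching the num_blocks in the dump above) under the name the raid expects, then wait for the bdev layer to examine it so the configuring raid claims it with exclusive_write. A rough sketch of that waitforbdev step, built only from the RPCs that appear above; the real helper in common/autotest_common.sh adds retries and a timeout that are omitted here.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # create the named malloc bdev; the configuring raid claims it during examine
  $rpc bdev_malloc_create 32 512 -b BaseBdev2

  # block until all registered bdevs are examined, then confirm the bdev answers within 2s
  $rpc bdev_wait_for_examine
  $rpc bdev_get_bdevs -b BaseBdev2 -t 2000 >/dev/null

After each addition the state query shows num_base_bdevs_discovered incremented while the raid stays in the configuring state until all three base bdevs are present.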
00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.411 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.669 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.669 "name": "Existed_Raid", 00:13:13.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.669 "strip_size_kb": 64, 00:13:13.669 "state": "configuring", 00:13:13.669 "raid_level": "raid0", 00:13:13.669 "superblock": false, 00:13:13.669 "num_base_bdevs": 3, 00:13:13.669 "num_base_bdevs_discovered": 2, 00:13:13.669 "num_base_bdevs_operational": 3, 00:13:13.669 "base_bdevs_list": [ 00:13:13.669 { 00:13:13.669 "name": "BaseBdev1", 00:13:13.669 "uuid": "7877af77-edef-4248-9e30-0e275c0d98ab", 00:13:13.669 "is_configured": true, 00:13:13.669 "data_offset": 0, 00:13:13.669 "data_size": 65536 00:13:13.669 }, 00:13:13.669 { 00:13:13.669 "name": "BaseBdev2", 00:13:13.669 "uuid": "1eb5da6e-f63b-4f17-ad4f-04003227ff92", 00:13:13.669 "is_configured": true, 00:13:13.669 "data_offset": 0, 00:13:13.669 "data_size": 65536 00:13:13.669 }, 00:13:13.669 { 00:13:13.669 "name": "BaseBdev3", 00:13:13.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.669 "is_configured": false, 00:13:13.669 "data_offset": 0, 00:13:13.669 "data_size": 0 00:13:13.669 } 00:13:13.669 ] 00:13:13.669 }' 00:13:13.669 09:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.669 09:17:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.234 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:14.495 [2024-07-15 09:17:23.303658] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:14.495 [2024-07-15 09:17:23.303708] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1029400 00:13:14.495 [2024-07-15 09:17:23.303717] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:14.495 [2024-07-15 09:17:23.303985] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1028ef0 00:13:14.495 [2024-07-15 09:17:23.304112] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1029400 00:13:14.495 [2024-07-15 09:17:23.304122] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1029400 00:13:14.495 [2024-07-15 09:17:23.304282] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:14.495 BaseBdev3 00:13:14.495 09:17:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:14.495 09:17:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:14.495 09:17:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:14.495 09:17:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:14.495 09:17:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:14.495 09:17:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:14.495 09:17:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:14.752 09:17:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:15.011 [ 00:13:15.011 { 00:13:15.011 "name": "BaseBdev3", 00:13:15.011 "aliases": [ 00:13:15.011 "b2a95042-73e8-4585-86b1-ad5d6eb3bf0a" 00:13:15.011 ], 00:13:15.011 "product_name": "Malloc disk", 00:13:15.011 "block_size": 512, 00:13:15.011 "num_blocks": 65536, 00:13:15.011 "uuid": "b2a95042-73e8-4585-86b1-ad5d6eb3bf0a", 00:13:15.011 "assigned_rate_limits": { 00:13:15.011 "rw_ios_per_sec": 0, 00:13:15.011 "rw_mbytes_per_sec": 0, 00:13:15.011 "r_mbytes_per_sec": 0, 00:13:15.011 "w_mbytes_per_sec": 0 00:13:15.011 }, 00:13:15.011 "claimed": true, 00:13:15.011 "claim_type": "exclusive_write", 00:13:15.011 "zoned": false, 00:13:15.011 "supported_io_types": { 00:13:15.011 "read": true, 00:13:15.011 "write": true, 00:13:15.011 "unmap": true, 00:13:15.011 "flush": true, 00:13:15.011 "reset": true, 00:13:15.011 "nvme_admin": false, 00:13:15.011 "nvme_io": false, 00:13:15.011 "nvme_io_md": false, 00:13:15.011 "write_zeroes": true, 00:13:15.011 "zcopy": true, 00:13:15.011 "get_zone_info": false, 00:13:15.011 "zone_management": false, 00:13:15.011 "zone_append": false, 00:13:15.011 "compare": false, 00:13:15.011 "compare_and_write": false, 00:13:15.011 "abort": true, 00:13:15.011 "seek_hole": false, 00:13:15.011 "seek_data": false, 00:13:15.011 "copy": true, 00:13:15.011 "nvme_iov_md": false 00:13:15.011 }, 00:13:15.011 "memory_domains": [ 00:13:15.011 { 00:13:15.011 "dma_device_id": "system", 00:13:15.011 "dma_device_type": 1 00:13:15.011 }, 00:13:15.011 { 00:13:15.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.011 "dma_device_type": 2 00:13:15.011 } 00:13:15.011 ], 00:13:15.011 "driver_specific": {} 00:13:15.011 } 00:13:15.011 ] 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:15.011 
09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.011 09:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:15.269 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:15.269 "name": "Existed_Raid", 00:13:15.269 "uuid": "63d20bf1-b563-41be-8d5c-1124459c91ce", 00:13:15.269 "strip_size_kb": 64, 00:13:15.269 "state": "online", 00:13:15.269 "raid_level": "raid0", 00:13:15.269 "superblock": false, 00:13:15.269 "num_base_bdevs": 3, 00:13:15.269 "num_base_bdevs_discovered": 3, 00:13:15.269 "num_base_bdevs_operational": 3, 00:13:15.269 "base_bdevs_list": [ 00:13:15.269 { 00:13:15.269 "name": "BaseBdev1", 00:13:15.269 "uuid": "7877af77-edef-4248-9e30-0e275c0d98ab", 00:13:15.269 "is_configured": true, 00:13:15.269 "data_offset": 0, 00:13:15.269 "data_size": 65536 00:13:15.269 }, 00:13:15.269 { 00:13:15.269 "name": "BaseBdev2", 00:13:15.269 "uuid": "1eb5da6e-f63b-4f17-ad4f-04003227ff92", 00:13:15.269 "is_configured": true, 00:13:15.269 "data_offset": 0, 00:13:15.269 "data_size": 65536 00:13:15.269 }, 00:13:15.269 { 00:13:15.269 "name": "BaseBdev3", 00:13:15.269 "uuid": "b2a95042-73e8-4585-86b1-ad5d6eb3bf0a", 00:13:15.269 "is_configured": true, 00:13:15.269 "data_offset": 0, 00:13:15.269 "data_size": 65536 00:13:15.269 } 00:13:15.269 ] 00:13:15.269 }' 00:13:15.269 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:15.269 09:17:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:15.836 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:15.836 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:15.836 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:15.836 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:15.836 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:15.836 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:15.836 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:15.836 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:16.094 [2024-07-15 09:17:24.824015] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:16.094 09:17:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:16.094 "name": "Existed_Raid", 00:13:16.094 "aliases": [ 00:13:16.094 "63d20bf1-b563-41be-8d5c-1124459c91ce" 00:13:16.094 ], 00:13:16.094 "product_name": "Raid Volume", 00:13:16.094 "block_size": 512, 00:13:16.094 "num_blocks": 196608, 00:13:16.094 "uuid": "63d20bf1-b563-41be-8d5c-1124459c91ce", 00:13:16.094 "assigned_rate_limits": { 00:13:16.094 "rw_ios_per_sec": 0, 00:13:16.094 "rw_mbytes_per_sec": 0, 00:13:16.094 "r_mbytes_per_sec": 0, 00:13:16.094 "w_mbytes_per_sec": 0 00:13:16.094 }, 00:13:16.094 "claimed": false, 00:13:16.094 "zoned": false, 00:13:16.094 "supported_io_types": { 00:13:16.094 "read": true, 00:13:16.094 "write": true, 00:13:16.094 "unmap": true, 00:13:16.094 "flush": true, 00:13:16.094 "reset": true, 00:13:16.094 "nvme_admin": false, 00:13:16.094 "nvme_io": false, 00:13:16.094 "nvme_io_md": false, 00:13:16.094 "write_zeroes": true, 00:13:16.094 "zcopy": false, 00:13:16.094 "get_zone_info": false, 00:13:16.094 "zone_management": false, 00:13:16.094 "zone_append": false, 00:13:16.094 "compare": false, 00:13:16.094 "compare_and_write": false, 00:13:16.094 "abort": false, 00:13:16.094 "seek_hole": false, 00:13:16.094 "seek_data": false, 00:13:16.094 "copy": false, 00:13:16.094 "nvme_iov_md": false 00:13:16.094 }, 00:13:16.094 "memory_domains": [ 00:13:16.094 { 00:13:16.094 "dma_device_id": "system", 00:13:16.094 "dma_device_type": 1 00:13:16.094 }, 00:13:16.094 { 00:13:16.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.094 "dma_device_type": 2 00:13:16.094 }, 00:13:16.094 { 00:13:16.094 "dma_device_id": "system", 00:13:16.094 "dma_device_type": 1 00:13:16.094 }, 00:13:16.094 { 00:13:16.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.094 "dma_device_type": 2 00:13:16.094 }, 00:13:16.094 { 00:13:16.094 "dma_device_id": "system", 00:13:16.094 "dma_device_type": 1 00:13:16.094 }, 00:13:16.094 { 00:13:16.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.094 "dma_device_type": 2 00:13:16.094 } 00:13:16.094 ], 00:13:16.094 "driver_specific": { 00:13:16.094 "raid": { 00:13:16.094 "uuid": "63d20bf1-b563-41be-8d5c-1124459c91ce", 00:13:16.094 "strip_size_kb": 64, 00:13:16.094 "state": "online", 00:13:16.094 "raid_level": "raid0", 00:13:16.094 "superblock": false, 00:13:16.094 "num_base_bdevs": 3, 00:13:16.094 "num_base_bdevs_discovered": 3, 00:13:16.094 "num_base_bdevs_operational": 3, 00:13:16.094 "base_bdevs_list": [ 00:13:16.094 { 00:13:16.094 "name": "BaseBdev1", 00:13:16.094 "uuid": "7877af77-edef-4248-9e30-0e275c0d98ab", 00:13:16.094 "is_configured": true, 00:13:16.094 "data_offset": 0, 00:13:16.094 "data_size": 65536 00:13:16.094 }, 00:13:16.094 { 00:13:16.094 "name": "BaseBdev2", 00:13:16.094 "uuid": "1eb5da6e-f63b-4f17-ad4f-04003227ff92", 00:13:16.094 "is_configured": true, 00:13:16.094 "data_offset": 0, 00:13:16.094 "data_size": 65536 00:13:16.094 }, 00:13:16.094 { 00:13:16.094 "name": "BaseBdev3", 00:13:16.094 "uuid": "b2a95042-73e8-4585-86b1-ad5d6eb3bf0a", 00:13:16.094 "is_configured": true, 00:13:16.094 "data_offset": 0, 00:13:16.094 "data_size": 65536 00:13:16.094 } 00:13:16.094 ] 00:13:16.094 } 00:13:16.094 } 00:13:16.094 }' 00:13:16.094 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:16.094 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:16.094 BaseBdev2 00:13:16.094 BaseBdev3' 00:13:16.094 09:17:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:16.094 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:16.094 09:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:16.352 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:16.352 "name": "BaseBdev1", 00:13:16.352 "aliases": [ 00:13:16.352 "7877af77-edef-4248-9e30-0e275c0d98ab" 00:13:16.352 ], 00:13:16.352 "product_name": "Malloc disk", 00:13:16.352 "block_size": 512, 00:13:16.352 "num_blocks": 65536, 00:13:16.352 "uuid": "7877af77-edef-4248-9e30-0e275c0d98ab", 00:13:16.352 "assigned_rate_limits": { 00:13:16.352 "rw_ios_per_sec": 0, 00:13:16.352 "rw_mbytes_per_sec": 0, 00:13:16.352 "r_mbytes_per_sec": 0, 00:13:16.352 "w_mbytes_per_sec": 0 00:13:16.352 }, 00:13:16.352 "claimed": true, 00:13:16.352 "claim_type": "exclusive_write", 00:13:16.352 "zoned": false, 00:13:16.352 "supported_io_types": { 00:13:16.352 "read": true, 00:13:16.352 "write": true, 00:13:16.352 "unmap": true, 00:13:16.352 "flush": true, 00:13:16.352 "reset": true, 00:13:16.352 "nvme_admin": false, 00:13:16.352 "nvme_io": false, 00:13:16.352 "nvme_io_md": false, 00:13:16.352 "write_zeroes": true, 00:13:16.352 "zcopy": true, 00:13:16.352 "get_zone_info": false, 00:13:16.352 "zone_management": false, 00:13:16.352 "zone_append": false, 00:13:16.352 "compare": false, 00:13:16.352 "compare_and_write": false, 00:13:16.352 "abort": true, 00:13:16.352 "seek_hole": false, 00:13:16.352 "seek_data": false, 00:13:16.352 "copy": true, 00:13:16.352 "nvme_iov_md": false 00:13:16.352 }, 00:13:16.352 "memory_domains": [ 00:13:16.352 { 00:13:16.352 "dma_device_id": "system", 00:13:16.352 "dma_device_type": 1 00:13:16.353 }, 00:13:16.353 { 00:13:16.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.353 "dma_device_type": 2 00:13:16.353 } 00:13:16.353 ], 00:13:16.353 "driver_specific": {} 00:13:16.353 }' 00:13:16.353 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.353 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.353 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:16.353 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.353 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:16.611 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:16.870 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:16.870 "name": "BaseBdev2", 00:13:16.870 "aliases": [ 00:13:16.870 "1eb5da6e-f63b-4f17-ad4f-04003227ff92" 00:13:16.870 ], 00:13:16.870 "product_name": "Malloc disk", 00:13:16.870 "block_size": 512, 00:13:16.870 "num_blocks": 65536, 00:13:16.870 "uuid": "1eb5da6e-f63b-4f17-ad4f-04003227ff92", 00:13:16.870 "assigned_rate_limits": { 00:13:16.870 "rw_ios_per_sec": 0, 00:13:16.870 "rw_mbytes_per_sec": 0, 00:13:16.870 "r_mbytes_per_sec": 0, 00:13:16.870 "w_mbytes_per_sec": 0 00:13:16.870 }, 00:13:16.870 "claimed": true, 00:13:16.870 "claim_type": "exclusive_write", 00:13:16.870 "zoned": false, 00:13:16.870 "supported_io_types": { 00:13:16.870 "read": true, 00:13:16.870 "write": true, 00:13:16.870 "unmap": true, 00:13:16.870 "flush": true, 00:13:16.870 "reset": true, 00:13:16.870 "nvme_admin": false, 00:13:16.870 "nvme_io": false, 00:13:16.870 "nvme_io_md": false, 00:13:16.870 "write_zeroes": true, 00:13:16.870 "zcopy": true, 00:13:16.870 "get_zone_info": false, 00:13:16.870 "zone_management": false, 00:13:16.870 "zone_append": false, 00:13:16.870 "compare": false, 00:13:16.870 "compare_and_write": false, 00:13:16.870 "abort": true, 00:13:16.870 "seek_hole": false, 00:13:16.870 "seek_data": false, 00:13:16.870 "copy": true, 00:13:16.870 "nvme_iov_md": false 00:13:16.870 }, 00:13:16.870 "memory_domains": [ 00:13:16.870 { 00:13:16.870 "dma_device_id": "system", 00:13:16.870 "dma_device_type": 1 00:13:16.870 }, 00:13:16.870 { 00:13:16.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.870 "dma_device_type": 2 00:13:16.870 } 00:13:16.870 ], 00:13:16.870 "driver_specific": {} 00:13:16.870 }' 00:13:16.870 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.870 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.128 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:17.128 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.128 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.128 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:17.128 09:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.128 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.128 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:17.386 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.386 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.386 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:17.386 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:17.386 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:17.386 09:17:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:17.645 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:17.645 "name": "BaseBdev3", 00:13:17.645 "aliases": [ 00:13:17.645 "b2a95042-73e8-4585-86b1-ad5d6eb3bf0a" 00:13:17.645 ], 00:13:17.645 "product_name": "Malloc disk", 00:13:17.645 "block_size": 512, 00:13:17.645 "num_blocks": 65536, 00:13:17.645 "uuid": "b2a95042-73e8-4585-86b1-ad5d6eb3bf0a", 00:13:17.645 "assigned_rate_limits": { 00:13:17.645 "rw_ios_per_sec": 0, 00:13:17.645 "rw_mbytes_per_sec": 0, 00:13:17.645 "r_mbytes_per_sec": 0, 00:13:17.645 "w_mbytes_per_sec": 0 00:13:17.645 }, 00:13:17.645 "claimed": true, 00:13:17.645 "claim_type": "exclusive_write", 00:13:17.645 "zoned": false, 00:13:17.645 "supported_io_types": { 00:13:17.645 "read": true, 00:13:17.645 "write": true, 00:13:17.645 "unmap": true, 00:13:17.645 "flush": true, 00:13:17.645 "reset": true, 00:13:17.645 "nvme_admin": false, 00:13:17.645 "nvme_io": false, 00:13:17.645 "nvme_io_md": false, 00:13:17.645 "write_zeroes": true, 00:13:17.645 "zcopy": true, 00:13:17.645 "get_zone_info": false, 00:13:17.645 "zone_management": false, 00:13:17.645 "zone_append": false, 00:13:17.645 "compare": false, 00:13:17.645 "compare_and_write": false, 00:13:17.645 "abort": true, 00:13:17.645 "seek_hole": false, 00:13:17.645 "seek_data": false, 00:13:17.645 "copy": true, 00:13:17.645 "nvme_iov_md": false 00:13:17.645 }, 00:13:17.645 "memory_domains": [ 00:13:17.645 { 00:13:17.645 "dma_device_id": "system", 00:13:17.645 "dma_device_type": 1 00:13:17.645 }, 00:13:17.645 { 00:13:17.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.645 "dma_device_type": 2 00:13:17.645 } 00:13:17.645 ], 00:13:17.645 "driver_specific": {} 00:13:17.645 }' 00:13:17.645 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.645 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.645 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:17.645 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.645 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.645 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:17.645 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.904 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.904 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:17.904 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.904 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.904 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:17.904 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:18.163 [2024-07-15 09:17:26.925356] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:18.163 [2024-07-15 09:17:26.925386] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:18.163 [2024-07-15 09:17:26.925429] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
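The block of jq checks above (verify_raid_bdev_properties) walks every configured base bdev of Existed_Raid and compares its block_size, md_size, md_interleave and dif_type against the raid volume itself, which is why the log is full of [[ 512 == 512 ]] and [[ null == null ]] tests. A condensed sketch of that loop, assembled from the rpc.py and jq invocations shown in this log, with error handling omitted:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # raid volume info and the names of its configured base bdevs (bdev_raid.sh@200-201)
  raid_bdev_info=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')
  base_bdev_names=$(jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' <<< "$raid_bdev_info")

  for name in $base_bdev_names; do
          base_bdev_info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
          # each base bdev must match the raid volume's geometry and metadata settings
          [[ $(jq .block_size <<< "$base_bdev_info") == $(jq .block_size <<< "$raid_bdev_info") ]]
          [[ $(jq .md_size <<< "$base_bdev_info") == $(jq .md_size <<< "$raid_bdev_info") ]]
          [[ $(jq .md_interleave <<< "$base_bdev_info") == $(jq .md_interleave <<< "$raid_bdev_info") ]]
          [[ $(jq .dif_type <<< "$base_bdev_info") == $(jq .dif_type <<< "$raid_bdev_info") ]]
  done

With the properties verified, the test deletes BaseBdev1 to drive the state-transition checks that follow.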
00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.163 09:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:18.422 09:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:18.422 "name": "Existed_Raid", 00:13:18.422 "uuid": "63d20bf1-b563-41be-8d5c-1124459c91ce", 00:13:18.422 "strip_size_kb": 64, 00:13:18.422 "state": "offline", 00:13:18.422 "raid_level": "raid0", 00:13:18.422 "superblock": false, 00:13:18.422 "num_base_bdevs": 3, 00:13:18.422 "num_base_bdevs_discovered": 2, 00:13:18.422 "num_base_bdevs_operational": 2, 00:13:18.422 "base_bdevs_list": [ 00:13:18.422 { 00:13:18.422 "name": null, 00:13:18.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:18.422 "is_configured": false, 00:13:18.422 "data_offset": 0, 00:13:18.422 "data_size": 65536 00:13:18.422 }, 00:13:18.422 { 00:13:18.422 "name": "BaseBdev2", 00:13:18.422 "uuid": "1eb5da6e-f63b-4f17-ad4f-04003227ff92", 00:13:18.422 "is_configured": true, 00:13:18.422 "data_offset": 0, 00:13:18.422 "data_size": 65536 00:13:18.422 }, 00:13:18.422 { 00:13:18.422 "name": "BaseBdev3", 00:13:18.422 "uuid": "b2a95042-73e8-4585-86b1-ad5d6eb3bf0a", 00:13:18.422 "is_configured": true, 00:13:18.422 "data_offset": 0, 00:13:18.422 "data_size": 65536 00:13:18.422 } 00:13:18.422 ] 00:13:18.422 }' 00:13:18.422 09:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:18.422 09:17:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:18.989 09:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:18.989 09:17:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:18.989 09:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.989 09:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:19.247 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:19.247 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:19.247 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:19.814 [2024-07-15 09:17:28.503553] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:19.814 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:19.814 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:19.814 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:19.814 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.073 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:20.073 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:20.073 09:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:20.331 [2024-07-15 09:17:29.278012] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:20.331 [2024-07-15 09:17:29.278056] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1029400 name Existed_Raid, state offline 00:13:20.590 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:20.590 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:20.590 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.590 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:20.848 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:20.848 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:20.848 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:20.848 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:20.848 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:20.848 09:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:21.106 BaseBdev2 00:13:21.364 09:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 
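Because raid0 carries no redundancy (has_redundancy returns 1 for it), the test expects Existed_Raid to drop from online to offline as soon as one base bdev is deleted, and to be cleaned up entirely once the rest are gone; it then pre-creates fresh BaseBdev2 and BaseBdev3 malloc disks for the next scenario, where base bdevs exist before the raid is created. A rough sketch of the teardown check, using only RPCs and jq filters that appear in this log; the one-line state extraction is a condensation of the verify_raid_bdev_state helper.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # raid0 has no redundancy, so losing one base bdev takes Existed_Raid from online to offline
  $rpc bdev_malloc_delete BaseBdev1
  state=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state')
  [[ $state == offline ]]

  # deleting the remaining base bdevs removes the raid bdev altogether (bdev_raid.sh@285-294)
  $rpc bdev_malloc_delete BaseBdev2
  $rpc bdev_malloc_delete BaseBdev3
  raid_bdev=$($rpc bdev_raid_get_bdevs all | jq -r '.[0]["name"] | select(.)')
  [[ -z $raid_bdev ]]

This mirrors the sequence the log walks through in bdev_raid.sh@274-294, with verify_raid_bdev_state asserting the intermediate offline state after the first deletion.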
00:13:21.364 09:17:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:21.364 09:17:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:21.364 09:17:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:21.364 09:17:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:21.364 09:17:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:21.364 09:17:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:21.623 09:17:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:21.881 [ 00:13:21.881 { 00:13:21.881 "name": "BaseBdev2", 00:13:21.881 "aliases": [ 00:13:21.881 "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148" 00:13:21.881 ], 00:13:21.881 "product_name": "Malloc disk", 00:13:21.881 "block_size": 512, 00:13:21.881 "num_blocks": 65536, 00:13:21.881 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:21.881 "assigned_rate_limits": { 00:13:21.881 "rw_ios_per_sec": 0, 00:13:21.881 "rw_mbytes_per_sec": 0, 00:13:21.881 "r_mbytes_per_sec": 0, 00:13:21.881 "w_mbytes_per_sec": 0 00:13:21.881 }, 00:13:21.881 "claimed": false, 00:13:21.881 "zoned": false, 00:13:21.881 "supported_io_types": { 00:13:21.881 "read": true, 00:13:21.881 "write": true, 00:13:21.881 "unmap": true, 00:13:21.881 "flush": true, 00:13:21.881 "reset": true, 00:13:21.881 "nvme_admin": false, 00:13:21.881 "nvme_io": false, 00:13:21.881 "nvme_io_md": false, 00:13:21.881 "write_zeroes": true, 00:13:21.881 "zcopy": true, 00:13:21.881 "get_zone_info": false, 00:13:21.881 "zone_management": false, 00:13:21.881 "zone_append": false, 00:13:21.881 "compare": false, 00:13:21.881 "compare_and_write": false, 00:13:21.881 "abort": true, 00:13:21.881 "seek_hole": false, 00:13:21.881 "seek_data": false, 00:13:21.881 "copy": true, 00:13:21.881 "nvme_iov_md": false 00:13:21.881 }, 00:13:21.881 "memory_domains": [ 00:13:21.881 { 00:13:21.881 "dma_device_id": "system", 00:13:21.881 "dma_device_type": 1 00:13:21.881 }, 00:13:21.881 { 00:13:21.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.881 "dma_device_type": 2 00:13:21.881 } 00:13:21.881 ], 00:13:21.881 "driver_specific": {} 00:13:21.881 } 00:13:21.881 ] 00:13:21.881 09:17:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:21.881 09:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:21.881 09:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:21.881 09:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:22.139 BaseBdev3 00:13:22.139 09:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:22.139 09:17:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:22.139 09:17:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:22.139 09:17:31 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:22.139 09:17:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:22.139 09:17:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:22.139 09:17:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:22.397 09:17:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:22.961 [ 00:13:22.961 { 00:13:22.961 "name": "BaseBdev3", 00:13:22.961 "aliases": [ 00:13:22.961 "80212b4d-9358-402e-887b-8535c5a65bf0" 00:13:22.961 ], 00:13:22.961 "product_name": "Malloc disk", 00:13:22.961 "block_size": 512, 00:13:22.961 "num_blocks": 65536, 00:13:22.961 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:22.961 "assigned_rate_limits": { 00:13:22.961 "rw_ios_per_sec": 0, 00:13:22.961 "rw_mbytes_per_sec": 0, 00:13:22.961 "r_mbytes_per_sec": 0, 00:13:22.961 "w_mbytes_per_sec": 0 00:13:22.961 }, 00:13:22.961 "claimed": false, 00:13:22.961 "zoned": false, 00:13:22.961 "supported_io_types": { 00:13:22.961 "read": true, 00:13:22.961 "write": true, 00:13:22.961 "unmap": true, 00:13:22.961 "flush": true, 00:13:22.961 "reset": true, 00:13:22.961 "nvme_admin": false, 00:13:22.961 "nvme_io": false, 00:13:22.961 "nvme_io_md": false, 00:13:22.961 "write_zeroes": true, 00:13:22.961 "zcopy": true, 00:13:22.961 "get_zone_info": false, 00:13:22.961 "zone_management": false, 00:13:22.961 "zone_append": false, 00:13:22.961 "compare": false, 00:13:22.961 "compare_and_write": false, 00:13:22.961 "abort": true, 00:13:22.961 "seek_hole": false, 00:13:22.961 "seek_data": false, 00:13:22.961 "copy": true, 00:13:22.961 "nvme_iov_md": false 00:13:22.961 }, 00:13:22.961 "memory_domains": [ 00:13:22.961 { 00:13:22.961 "dma_device_id": "system", 00:13:22.961 "dma_device_type": 1 00:13:22.961 }, 00:13:22.961 { 00:13:22.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.961 "dma_device_type": 2 00:13:22.961 } 00:13:22.961 ], 00:13:22.961 "driver_specific": {} 00:13:22.961 } 00:13:22.961 ] 00:13:22.961 09:17:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:22.961 09:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:22.961 09:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:22.961 09:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:23.217 [2024-07-15 09:17:32.020896] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:23.217 [2024-07-15 09:17:32.020953] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:23.217 [2024-07-15 09:17:32.020976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:23.217 [2024-07-15 09:17:32.022349] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 3 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.217 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.474 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.474 "name": "Existed_Raid", 00:13:23.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.474 "strip_size_kb": 64, 00:13:23.474 "state": "configuring", 00:13:23.474 "raid_level": "raid0", 00:13:23.474 "superblock": false, 00:13:23.474 "num_base_bdevs": 3, 00:13:23.474 "num_base_bdevs_discovered": 2, 00:13:23.474 "num_base_bdevs_operational": 3, 00:13:23.474 "base_bdevs_list": [ 00:13:23.474 { 00:13:23.474 "name": "BaseBdev1", 00:13:23.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.474 "is_configured": false, 00:13:23.474 "data_offset": 0, 00:13:23.474 "data_size": 0 00:13:23.474 }, 00:13:23.474 { 00:13:23.474 "name": "BaseBdev2", 00:13:23.474 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:23.474 "is_configured": true, 00:13:23.474 "data_offset": 0, 00:13:23.474 "data_size": 65536 00:13:23.474 }, 00:13:23.474 { 00:13:23.474 "name": "BaseBdev3", 00:13:23.474 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:23.474 "is_configured": true, 00:13:23.474 "data_offset": 0, 00:13:23.474 "data_size": 65536 00:13:23.474 } 00:13:23.474 ] 00:13:23.474 }' 00:13:23.474 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.474 09:17:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.039 09:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:24.295 [2024-07-15 09:17:33.099748] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.295 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.553 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.553 "name": "Existed_Raid", 00:13:24.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.553 "strip_size_kb": 64, 00:13:24.553 "state": "configuring", 00:13:24.553 "raid_level": "raid0", 00:13:24.553 "superblock": false, 00:13:24.553 "num_base_bdevs": 3, 00:13:24.553 "num_base_bdevs_discovered": 1, 00:13:24.553 "num_base_bdevs_operational": 3, 00:13:24.553 "base_bdevs_list": [ 00:13:24.553 { 00:13:24.553 "name": "BaseBdev1", 00:13:24.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.553 "is_configured": false, 00:13:24.553 "data_offset": 0, 00:13:24.553 "data_size": 0 00:13:24.553 }, 00:13:24.553 { 00:13:24.553 "name": null, 00:13:24.553 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:24.553 "is_configured": false, 00:13:24.553 "data_offset": 0, 00:13:24.553 "data_size": 65536 00:13:24.553 }, 00:13:24.553 { 00:13:24.553 "name": "BaseBdev3", 00:13:24.553 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:24.553 "is_configured": true, 00:13:24.553 "data_offset": 0, 00:13:24.553 "data_size": 65536 00:13:24.553 } 00:13:24.553 ] 00:13:24.553 }' 00:13:24.553 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.553 09:17:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.152 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.152 09:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:25.410 09:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:25.410 09:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:25.976 [2024-07-15 09:17:34.704584] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:25.976 BaseBdev1 00:13:25.976 09:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:25.976 09:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:25.976 09:17:34 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:25.976 09:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:25.976 09:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:25.976 09:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:25.976 09:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:26.234 09:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:26.799 [ 00:13:26.799 { 00:13:26.799 "name": "BaseBdev1", 00:13:26.799 "aliases": [ 00:13:26.799 "269e0eb4-c907-499d-ad11-501ad3d63571" 00:13:26.799 ], 00:13:26.799 "product_name": "Malloc disk", 00:13:26.799 "block_size": 512, 00:13:26.799 "num_blocks": 65536, 00:13:26.799 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:26.799 "assigned_rate_limits": { 00:13:26.799 "rw_ios_per_sec": 0, 00:13:26.799 "rw_mbytes_per_sec": 0, 00:13:26.799 "r_mbytes_per_sec": 0, 00:13:26.799 "w_mbytes_per_sec": 0 00:13:26.799 }, 00:13:26.799 "claimed": true, 00:13:26.799 "claim_type": "exclusive_write", 00:13:26.799 "zoned": false, 00:13:26.799 "supported_io_types": { 00:13:26.799 "read": true, 00:13:26.799 "write": true, 00:13:26.799 "unmap": true, 00:13:26.799 "flush": true, 00:13:26.799 "reset": true, 00:13:26.799 "nvme_admin": false, 00:13:26.799 "nvme_io": false, 00:13:26.799 "nvme_io_md": false, 00:13:26.799 "write_zeroes": true, 00:13:26.799 "zcopy": true, 00:13:26.799 "get_zone_info": false, 00:13:26.799 "zone_management": false, 00:13:26.799 "zone_append": false, 00:13:26.799 "compare": false, 00:13:26.799 "compare_and_write": false, 00:13:26.799 "abort": true, 00:13:26.799 "seek_hole": false, 00:13:26.799 "seek_data": false, 00:13:26.799 "copy": true, 00:13:26.799 "nvme_iov_md": false 00:13:26.799 }, 00:13:26.799 "memory_domains": [ 00:13:26.799 { 00:13:26.799 "dma_device_id": "system", 00:13:26.799 "dma_device_type": 1 00:13:26.799 }, 00:13:26.799 { 00:13:26.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.799 "dma_device_type": 2 00:13:26.799 } 00:13:26.799 ], 00:13:26.799 "driver_specific": {} 00:13:26.799 } 00:13:26.799 ] 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.800 09:17:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.800 "name": "Existed_Raid", 00:13:26.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.800 "strip_size_kb": 64, 00:13:26.800 "state": "configuring", 00:13:26.800 "raid_level": "raid0", 00:13:26.800 "superblock": false, 00:13:26.800 "num_base_bdevs": 3, 00:13:26.800 "num_base_bdevs_discovered": 2, 00:13:26.800 "num_base_bdevs_operational": 3, 00:13:26.800 "base_bdevs_list": [ 00:13:26.800 { 00:13:26.800 "name": "BaseBdev1", 00:13:26.800 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:26.800 "is_configured": true, 00:13:26.800 "data_offset": 0, 00:13:26.800 "data_size": 65536 00:13:26.800 }, 00:13:26.800 { 00:13:26.800 "name": null, 00:13:26.800 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:26.800 "is_configured": false, 00:13:26.800 "data_offset": 0, 00:13:26.800 "data_size": 65536 00:13:26.800 }, 00:13:26.800 { 00:13:26.800 "name": "BaseBdev3", 00:13:26.800 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:26.800 "is_configured": true, 00:13:26.800 "data_offset": 0, 00:13:26.800 "data_size": 65536 00:13:26.800 } 00:13:26.800 ] 00:13:26.800 }' 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.800 09:17:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.363 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.363 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:27.619 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:27.619 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:27.876 [2024-07-15 09:17:36.774129] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:27.876 09:17:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.876 09:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.132 09:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.132 "name": "Existed_Raid", 00:13:28.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.132 "strip_size_kb": 64, 00:13:28.132 "state": "configuring", 00:13:28.132 "raid_level": "raid0", 00:13:28.132 "superblock": false, 00:13:28.132 "num_base_bdevs": 3, 00:13:28.132 "num_base_bdevs_discovered": 1, 00:13:28.132 "num_base_bdevs_operational": 3, 00:13:28.132 "base_bdevs_list": [ 00:13:28.132 { 00:13:28.132 "name": "BaseBdev1", 00:13:28.132 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:28.132 "is_configured": true, 00:13:28.132 "data_offset": 0, 00:13:28.132 "data_size": 65536 00:13:28.132 }, 00:13:28.132 { 00:13:28.132 "name": null, 00:13:28.132 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:28.132 "is_configured": false, 00:13:28.132 "data_offset": 0, 00:13:28.132 "data_size": 65536 00:13:28.132 }, 00:13:28.132 { 00:13:28.132 "name": null, 00:13:28.132 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:28.132 "is_configured": false, 00:13:28.132 "data_offset": 0, 00:13:28.132 "data_size": 65536 00:13:28.132 } 00:13:28.132 ] 00:13:28.132 }' 00:13:28.132 09:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.132 09:17:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.696 09:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.696 09:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:28.953 09:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:28.953 09:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:29.210 [2024-07-15 09:17:38.117715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
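The verify_raid_bdev_state calls traced above reduce to a single RPC plus a jq filter on the named raid bdev. A minimal stand-alone sketch of that check follows; the field comparisons are an assumption (the trace runs them with xtrace disabled), and the real helper in bdev/bdev_raid.sh may check more fields.

# Sketch of the state check driven above: fetch the named raid bdev and
# compare the fields the test cares about. Assumes a running SPDK app
# listening on the -s socket used throughout this run.
verify_raid_state() {
    local name=$1 expected_state=$2 level=$3 strip=$4
    local sock=/var/tmp/spdk-raid.sock info
    info=$(./scripts/rpc.py -s $sock bdev_raid_get_bdevs all \
        | jq -r ".[] | select(.name == \"$name\")")
    [ "$(echo "$info" | jq -r .state)" = "$expected_state" ] || return 1
    [ "$(echo "$info" | jq -r .raid_level)" = "$level" ] || return 1
    [ "$(echo "$info" | jq -r .strip_size_kb)" = "$strip" ] || return 1
}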
00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.210 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:29.467 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.467 "name": "Existed_Raid", 00:13:29.467 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:29.467 "strip_size_kb": 64, 00:13:29.467 "state": "configuring", 00:13:29.467 "raid_level": "raid0", 00:13:29.467 "superblock": false, 00:13:29.467 "num_base_bdevs": 3, 00:13:29.467 "num_base_bdevs_discovered": 2, 00:13:29.467 "num_base_bdevs_operational": 3, 00:13:29.467 "base_bdevs_list": [ 00:13:29.467 { 00:13:29.467 "name": "BaseBdev1", 00:13:29.467 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:29.467 "is_configured": true, 00:13:29.467 "data_offset": 0, 00:13:29.467 "data_size": 65536 00:13:29.467 }, 00:13:29.467 { 00:13:29.467 "name": null, 00:13:29.467 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:29.467 "is_configured": false, 00:13:29.467 "data_offset": 0, 00:13:29.467 "data_size": 65536 00:13:29.467 }, 00:13:29.467 { 00:13:29.467 "name": "BaseBdev3", 00:13:29.467 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:29.467 "is_configured": true, 00:13:29.467 "data_offset": 0, 00:13:29.467 "data_size": 65536 00:13:29.467 } 00:13:29.467 ] 00:13:29.467 }' 00:13:29.467 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.467 09:17:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.032 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.032 09:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:30.290 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:30.290 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:30.549 [2024-07-15 09:17:39.304891] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:30.549 09:17:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.549 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:30.808 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:30.808 "name": "Existed_Raid", 00:13:30.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:30.808 "strip_size_kb": 64, 00:13:30.808 "state": "configuring", 00:13:30.808 "raid_level": "raid0", 00:13:30.808 "superblock": false, 00:13:30.808 "num_base_bdevs": 3, 00:13:30.808 "num_base_bdevs_discovered": 1, 00:13:30.808 "num_base_bdevs_operational": 3, 00:13:30.808 "base_bdevs_list": [ 00:13:30.808 { 00:13:30.808 "name": null, 00:13:30.808 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:30.808 "is_configured": false, 00:13:30.808 "data_offset": 0, 00:13:30.808 "data_size": 65536 00:13:30.808 }, 00:13:30.808 { 00:13:30.808 "name": null, 00:13:30.808 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:30.808 "is_configured": false, 00:13:30.808 "data_offset": 0, 00:13:30.808 "data_size": 65536 00:13:30.808 }, 00:13:30.808 { 00:13:30.808 "name": "BaseBdev3", 00:13:30.808 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:30.808 "is_configured": true, 00:13:30.808 "data_offset": 0, 00:13:30.808 "data_size": 65536 00:13:30.808 } 00:13:30.808 ] 00:13:30.808 }' 00:13:30.808 09:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:30.808 09:17:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.374 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.374 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:31.632 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:31.632 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:31.890 [2024-07-15 09:17:40.680989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
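The bdev_malloc_delete / bdev_raid_add_base_bdev cycle traced around this point reduces to the bare RPC sequence below (socket path and bdev names as in the trace; the surrounding verify_raid_bdev_state calls are omitted).

# Drop a claimed base bdev, confirm its slot is no longer configured, then
# re-attach an existing base bdev into the array by name.
sock=/var/tmp/spdk-raid.sock
rpc=./scripts/rpc.py
$rpc -s $sock bdev_malloc_delete BaseBdev1
$rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[0].is_configured'   # expect false
$rpc -s $sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
$rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'   # expect true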
00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.890 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:32.148 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:32.148 "name": "Existed_Raid", 00:13:32.148 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.148 "strip_size_kb": 64, 00:13:32.148 "state": "configuring", 00:13:32.148 "raid_level": "raid0", 00:13:32.148 "superblock": false, 00:13:32.148 "num_base_bdevs": 3, 00:13:32.148 "num_base_bdevs_discovered": 2, 00:13:32.148 "num_base_bdevs_operational": 3, 00:13:32.148 "base_bdevs_list": [ 00:13:32.148 { 00:13:32.148 "name": null, 00:13:32.148 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:32.148 "is_configured": false, 00:13:32.148 "data_offset": 0, 00:13:32.148 "data_size": 65536 00:13:32.148 }, 00:13:32.148 { 00:13:32.148 "name": "BaseBdev2", 00:13:32.148 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:32.148 "is_configured": true, 00:13:32.148 "data_offset": 0, 00:13:32.148 "data_size": 65536 00:13:32.149 }, 00:13:32.149 { 00:13:32.149 "name": "BaseBdev3", 00:13:32.149 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:32.149 "is_configured": true, 00:13:32.149 "data_offset": 0, 00:13:32.149 "data_size": 65536 00:13:32.149 } 00:13:32.149 ] 00:13:32.149 }' 00:13:32.149 09:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:32.149 09:17:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.714 09:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:32.714 09:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.972 09:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:32.972 09:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.972 09:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:33.231 09:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 269e0eb4-c907-499d-ad11-501ad3d63571 00:13:33.489 [2024-07-15 09:17:42.217590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 
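Re-populating the first slot is done by recreating a malloc bdev under the UUID the raid still records for it, as in the @333 steps just traced. Condensed below; the 32/512 arguments give the same 65536 x 512-byte geometry reported for the other base bdevs.

# Recover the UUID the raid kept for the empty slot, then create a malloc
# bdev under that UUID so the raid module can claim it as the replacement.
sock=/var/tmp/spdk-raid.sock
rpc=./scripts/rpc.py
uuid=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')
$rpc -s $sock bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"   # 32 MiB, 512-byte blocks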
00:13:33.489 [2024-07-15 09:17:42.217633] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1027450 00:13:33.489 [2024-07-15 09:17:42.217642] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:33.489 [2024-07-15 09:17:42.217840] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1028a50 00:13:33.489 [2024-07-15 09:17:42.217968] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1027450 00:13:33.489 [2024-07-15 09:17:42.217978] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1027450 00:13:33.489 [2024-07-15 09:17:42.218149] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:33.489 NewBaseBdev 00:13:33.489 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:33.489 09:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:33.489 09:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:33.489 09:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:33.489 09:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:33.489 09:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:33.489 09:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:33.747 09:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:34.005 [ 00:13:34.005 { 00:13:34.005 "name": "NewBaseBdev", 00:13:34.005 "aliases": [ 00:13:34.005 "269e0eb4-c907-499d-ad11-501ad3d63571" 00:13:34.005 ], 00:13:34.005 "product_name": "Malloc disk", 00:13:34.005 "block_size": 512, 00:13:34.005 "num_blocks": 65536, 00:13:34.005 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:34.005 "assigned_rate_limits": { 00:13:34.005 "rw_ios_per_sec": 0, 00:13:34.005 "rw_mbytes_per_sec": 0, 00:13:34.005 "r_mbytes_per_sec": 0, 00:13:34.005 "w_mbytes_per_sec": 0 00:13:34.005 }, 00:13:34.005 "claimed": true, 00:13:34.005 "claim_type": "exclusive_write", 00:13:34.005 "zoned": false, 00:13:34.005 "supported_io_types": { 00:13:34.005 "read": true, 00:13:34.005 "write": true, 00:13:34.005 "unmap": true, 00:13:34.005 "flush": true, 00:13:34.006 "reset": true, 00:13:34.006 "nvme_admin": false, 00:13:34.006 "nvme_io": false, 00:13:34.006 "nvme_io_md": false, 00:13:34.006 "write_zeroes": true, 00:13:34.006 "zcopy": true, 00:13:34.006 "get_zone_info": false, 00:13:34.006 "zone_management": false, 00:13:34.006 "zone_append": false, 00:13:34.006 "compare": false, 00:13:34.006 "compare_and_write": false, 00:13:34.006 "abort": true, 00:13:34.006 "seek_hole": false, 00:13:34.006 "seek_data": false, 00:13:34.006 "copy": true, 00:13:34.006 "nvme_iov_md": false 00:13:34.006 }, 00:13:34.006 "memory_domains": [ 00:13:34.006 { 00:13:34.006 "dma_device_id": "system", 00:13:34.006 "dma_device_type": 1 00:13:34.006 }, 00:13:34.006 { 00:13:34.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:34.006 "dma_device_type": 2 00:13:34.006 } 00:13:34.006 ], 00:13:34.006 "driver_specific": {} 00:13:34.006 } 00:13:34.006 ] 
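The waitforbdev helper used at @334 (and for every other BaseBdev in this run) does no shell-side polling in the path traced here; it is essentially two RPCs, with the -t timeout doing the waiting inside the target. A minimal reading with the 2000 ms default seen in the trace is below; the full helper in autotest_common.sh may add retries on top.

# Let pending examine callbacks finish, then ask for the bdev with a
# server-side timeout so the call blocks until it appears or times out.
waitforbdev() {
    local name=$1 timeout=${2:-2000}
    local sock=/var/tmp/spdk-raid.sock
    ./scripts/rpc.py -s $sock bdev_wait_for_examine
    ./scripts/rpc.py -s $sock bdev_get_bdevs -b "$name" -t "$timeout" > /dev/null
}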
00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.006 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:34.264 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.264 "name": "Existed_Raid", 00:13:34.264 "uuid": "d9b7e8a9-740c-451a-8a77-bce2a70f1720", 00:13:34.264 "strip_size_kb": 64, 00:13:34.264 "state": "online", 00:13:34.264 "raid_level": "raid0", 00:13:34.264 "superblock": false, 00:13:34.264 "num_base_bdevs": 3, 00:13:34.264 "num_base_bdevs_discovered": 3, 00:13:34.264 "num_base_bdevs_operational": 3, 00:13:34.264 "base_bdevs_list": [ 00:13:34.264 { 00:13:34.264 "name": "NewBaseBdev", 00:13:34.264 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:34.264 "is_configured": true, 00:13:34.264 "data_offset": 0, 00:13:34.264 "data_size": 65536 00:13:34.264 }, 00:13:34.264 { 00:13:34.264 "name": "BaseBdev2", 00:13:34.264 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:34.264 "is_configured": true, 00:13:34.264 "data_offset": 0, 00:13:34.264 "data_size": 65536 00:13:34.264 }, 00:13:34.264 { 00:13:34.264 "name": "BaseBdev3", 00:13:34.264 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:34.264 "is_configured": true, 00:13:34.264 "data_offset": 0, 00:13:34.264 "data_size": 65536 00:13:34.264 } 00:13:34.264 ] 00:13:34.264 }' 00:13:34.264 09:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.264 09:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.828 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:34.828 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:34.828 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:34.828 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:34.828 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:13:34.828 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:34.828 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:34.828 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:34.828 [2024-07-15 09:17:43.774045] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:35.086 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:35.086 "name": "Existed_Raid", 00:13:35.086 "aliases": [ 00:13:35.086 "d9b7e8a9-740c-451a-8a77-bce2a70f1720" 00:13:35.086 ], 00:13:35.086 "product_name": "Raid Volume", 00:13:35.086 "block_size": 512, 00:13:35.086 "num_blocks": 196608, 00:13:35.086 "uuid": "d9b7e8a9-740c-451a-8a77-bce2a70f1720", 00:13:35.086 "assigned_rate_limits": { 00:13:35.086 "rw_ios_per_sec": 0, 00:13:35.086 "rw_mbytes_per_sec": 0, 00:13:35.086 "r_mbytes_per_sec": 0, 00:13:35.086 "w_mbytes_per_sec": 0 00:13:35.086 }, 00:13:35.086 "claimed": false, 00:13:35.086 "zoned": false, 00:13:35.086 "supported_io_types": { 00:13:35.086 "read": true, 00:13:35.086 "write": true, 00:13:35.086 "unmap": true, 00:13:35.086 "flush": true, 00:13:35.086 "reset": true, 00:13:35.086 "nvme_admin": false, 00:13:35.086 "nvme_io": false, 00:13:35.086 "nvme_io_md": false, 00:13:35.086 "write_zeroes": true, 00:13:35.086 "zcopy": false, 00:13:35.086 "get_zone_info": false, 00:13:35.086 "zone_management": false, 00:13:35.086 "zone_append": false, 00:13:35.086 "compare": false, 00:13:35.086 "compare_and_write": false, 00:13:35.086 "abort": false, 00:13:35.086 "seek_hole": false, 00:13:35.086 "seek_data": false, 00:13:35.086 "copy": false, 00:13:35.086 "nvme_iov_md": false 00:13:35.086 }, 00:13:35.086 "memory_domains": [ 00:13:35.086 { 00:13:35.086 "dma_device_id": "system", 00:13:35.086 "dma_device_type": 1 00:13:35.086 }, 00:13:35.086 { 00:13:35.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.086 "dma_device_type": 2 00:13:35.086 }, 00:13:35.086 { 00:13:35.086 "dma_device_id": "system", 00:13:35.086 "dma_device_type": 1 00:13:35.086 }, 00:13:35.086 { 00:13:35.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.086 "dma_device_type": 2 00:13:35.086 }, 00:13:35.086 { 00:13:35.086 "dma_device_id": "system", 00:13:35.086 "dma_device_type": 1 00:13:35.086 }, 00:13:35.086 { 00:13:35.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.086 "dma_device_type": 2 00:13:35.086 } 00:13:35.086 ], 00:13:35.086 "driver_specific": { 00:13:35.086 "raid": { 00:13:35.086 "uuid": "d9b7e8a9-740c-451a-8a77-bce2a70f1720", 00:13:35.086 "strip_size_kb": 64, 00:13:35.086 "state": "online", 00:13:35.086 "raid_level": "raid0", 00:13:35.086 "superblock": false, 00:13:35.086 "num_base_bdevs": 3, 00:13:35.086 "num_base_bdevs_discovered": 3, 00:13:35.086 "num_base_bdevs_operational": 3, 00:13:35.086 "base_bdevs_list": [ 00:13:35.086 { 00:13:35.086 "name": "NewBaseBdev", 00:13:35.086 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:35.086 "is_configured": true, 00:13:35.086 "data_offset": 0, 00:13:35.086 "data_size": 65536 00:13:35.086 }, 00:13:35.086 { 00:13:35.086 "name": "BaseBdev2", 00:13:35.086 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:35.086 "is_configured": true, 00:13:35.086 "data_offset": 0, 00:13:35.086 "data_size": 65536 00:13:35.086 }, 00:13:35.086 { 00:13:35.086 "name": "BaseBdev3", 00:13:35.086 
"uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:35.086 "is_configured": true, 00:13:35.086 "data_offset": 0, 00:13:35.086 "data_size": 65536 00:13:35.086 } 00:13:35.086 ] 00:13:35.086 } 00:13:35.086 } 00:13:35.086 }' 00:13:35.086 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:35.086 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:35.086 BaseBdev2 00:13:35.086 BaseBdev3' 00:13:35.086 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:35.086 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:35.086 09:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:35.345 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:35.345 "name": "NewBaseBdev", 00:13:35.345 "aliases": [ 00:13:35.345 "269e0eb4-c907-499d-ad11-501ad3d63571" 00:13:35.345 ], 00:13:35.345 "product_name": "Malloc disk", 00:13:35.345 "block_size": 512, 00:13:35.345 "num_blocks": 65536, 00:13:35.345 "uuid": "269e0eb4-c907-499d-ad11-501ad3d63571", 00:13:35.345 "assigned_rate_limits": { 00:13:35.345 "rw_ios_per_sec": 0, 00:13:35.345 "rw_mbytes_per_sec": 0, 00:13:35.345 "r_mbytes_per_sec": 0, 00:13:35.345 "w_mbytes_per_sec": 0 00:13:35.345 }, 00:13:35.345 "claimed": true, 00:13:35.345 "claim_type": "exclusive_write", 00:13:35.345 "zoned": false, 00:13:35.345 "supported_io_types": { 00:13:35.345 "read": true, 00:13:35.345 "write": true, 00:13:35.345 "unmap": true, 00:13:35.345 "flush": true, 00:13:35.345 "reset": true, 00:13:35.345 "nvme_admin": false, 00:13:35.345 "nvme_io": false, 00:13:35.345 "nvme_io_md": false, 00:13:35.345 "write_zeroes": true, 00:13:35.345 "zcopy": true, 00:13:35.345 "get_zone_info": false, 00:13:35.345 "zone_management": false, 00:13:35.345 "zone_append": false, 00:13:35.345 "compare": false, 00:13:35.345 "compare_and_write": false, 00:13:35.345 "abort": true, 00:13:35.345 "seek_hole": false, 00:13:35.345 "seek_data": false, 00:13:35.345 "copy": true, 00:13:35.345 "nvme_iov_md": false 00:13:35.345 }, 00:13:35.345 "memory_domains": [ 00:13:35.345 { 00:13:35.345 "dma_device_id": "system", 00:13:35.345 "dma_device_type": 1 00:13:35.345 }, 00:13:35.345 { 00:13:35.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.345 "dma_device_type": 2 00:13:35.345 } 00:13:35.345 ], 00:13:35.345 "driver_specific": {} 00:13:35.345 }' 00:13:35.345 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:35.345 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:35.345 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:35.345 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:35.345 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:35.345 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:35.345 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:35.345 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:35.604 09:17:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:35.604 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:35.604 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:35.604 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:35.604 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:35.604 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:35.604 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:35.863 "name": "BaseBdev2", 00:13:35.863 "aliases": [ 00:13:35.863 "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148" 00:13:35.863 ], 00:13:35.863 "product_name": "Malloc disk", 00:13:35.863 "block_size": 512, 00:13:35.863 "num_blocks": 65536, 00:13:35.863 "uuid": "d511c4b2-fdd8-41c4-92b7-cf6a2ef17148", 00:13:35.863 "assigned_rate_limits": { 00:13:35.863 "rw_ios_per_sec": 0, 00:13:35.863 "rw_mbytes_per_sec": 0, 00:13:35.863 "r_mbytes_per_sec": 0, 00:13:35.863 "w_mbytes_per_sec": 0 00:13:35.863 }, 00:13:35.863 "claimed": true, 00:13:35.863 "claim_type": "exclusive_write", 00:13:35.863 "zoned": false, 00:13:35.863 "supported_io_types": { 00:13:35.863 "read": true, 00:13:35.863 "write": true, 00:13:35.863 "unmap": true, 00:13:35.863 "flush": true, 00:13:35.863 "reset": true, 00:13:35.863 "nvme_admin": false, 00:13:35.863 "nvme_io": false, 00:13:35.863 "nvme_io_md": false, 00:13:35.863 "write_zeroes": true, 00:13:35.863 "zcopy": true, 00:13:35.863 "get_zone_info": false, 00:13:35.863 "zone_management": false, 00:13:35.863 "zone_append": false, 00:13:35.863 "compare": false, 00:13:35.863 "compare_and_write": false, 00:13:35.863 "abort": true, 00:13:35.863 "seek_hole": false, 00:13:35.863 "seek_data": false, 00:13:35.863 "copy": true, 00:13:35.863 "nvme_iov_md": false 00:13:35.863 }, 00:13:35.863 "memory_domains": [ 00:13:35.863 { 00:13:35.863 "dma_device_id": "system", 00:13:35.863 "dma_device_type": 1 00:13:35.863 }, 00:13:35.863 { 00:13:35.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.863 "dma_device_type": 2 00:13:35.863 } 00:13:35.863 ], 00:13:35.863 "driver_specific": {} 00:13:35.863 }' 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:35.863 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:13:36.121 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.121 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.121 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.121 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.121 09:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:36.121 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.121 "name": "BaseBdev3", 00:13:36.121 "aliases": [ 00:13:36.121 "80212b4d-9358-402e-887b-8535c5a65bf0" 00:13:36.121 ], 00:13:36.121 "product_name": "Malloc disk", 00:13:36.121 "block_size": 512, 00:13:36.121 "num_blocks": 65536, 00:13:36.121 "uuid": "80212b4d-9358-402e-887b-8535c5a65bf0", 00:13:36.121 "assigned_rate_limits": { 00:13:36.121 "rw_ios_per_sec": 0, 00:13:36.121 "rw_mbytes_per_sec": 0, 00:13:36.121 "r_mbytes_per_sec": 0, 00:13:36.121 "w_mbytes_per_sec": 0 00:13:36.121 }, 00:13:36.121 "claimed": true, 00:13:36.121 "claim_type": "exclusive_write", 00:13:36.121 "zoned": false, 00:13:36.121 "supported_io_types": { 00:13:36.121 "read": true, 00:13:36.121 "write": true, 00:13:36.121 "unmap": true, 00:13:36.121 "flush": true, 00:13:36.121 "reset": true, 00:13:36.121 "nvme_admin": false, 00:13:36.121 "nvme_io": false, 00:13:36.121 "nvme_io_md": false, 00:13:36.121 "write_zeroes": true, 00:13:36.121 "zcopy": true, 00:13:36.121 "get_zone_info": false, 00:13:36.121 "zone_management": false, 00:13:36.121 "zone_append": false, 00:13:36.121 "compare": false, 00:13:36.121 "compare_and_write": false, 00:13:36.121 "abort": true, 00:13:36.121 "seek_hole": false, 00:13:36.121 "seek_data": false, 00:13:36.121 "copy": true, 00:13:36.121 "nvme_iov_md": false 00:13:36.121 }, 00:13:36.121 "memory_domains": [ 00:13:36.121 { 00:13:36.121 "dma_device_id": "system", 00:13:36.121 "dma_device_type": 1 00:13:36.121 }, 00:13:36.121 { 00:13:36.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.121 "dma_device_type": 2 00:13:36.121 } 00:13:36.121 ], 00:13:36.121 "driver_specific": {} 00:13:36.121 }' 00:13:36.121 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.379 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.379 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:36.379 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.379 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.379 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:36.379 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.379 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.379 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:36.379 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.637 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.637 09:17:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.637 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:36.895 [2024-07-15 09:17:45.626652] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:36.895 [2024-07-15 09:17:45.626681] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:36.895 [2024-07-15 09:17:45.626734] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:36.895 [2024-07-15 09:17:45.626784] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:36.895 [2024-07-15 09:17:45.626796] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1027450 name Existed_Raid, state offline 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 104384 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 104384 ']' 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 104384 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 104384 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 104384' 00:13:36.895 killing process with pid 104384 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 104384 00:13:36.895 [2024-07-15 09:17:45.696335] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:36.895 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 104384 00:13:36.896 [2024-07-15 09:17:45.723824] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:37.154 09:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:37.154 00:13:37.154 real 0m29.650s 00:13:37.154 user 0m54.314s 00:13:37.154 sys 0m5.291s 00:13:37.154 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:37.154 09:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.154 ************************************ 00:13:37.154 END TEST raid_state_function_test 00:13:37.154 ************************************ 00:13:37.154 09:17:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:37.154 09:17:45 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:13:37.154 09:17:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:37.154 09:17:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:37.154 09:17:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:37.154 ************************************ 00:13:37.154 START TEST 
raid_state_function_test_sb 00:13:37.154 ************************************ 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=108859 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 108859' 00:13:37.154 Process raid pid: 108859 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 108859 /var/tmp/spdk-raid.sock 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 108859 ']' 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:37.154 09:17:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:37.154 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:37.155 09:17:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:37.155 09:17:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:37.155 [2024-07-15 09:17:46.076581] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:13:37.155 [2024-07-15 09:17:46.076647] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:37.413 [2024-07-15 09:17:46.208084] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:37.413 [2024-07-15 09:17:46.313006] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:37.671 [2024-07-15 09:17:46.377682] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:37.671 [2024-07-15 09:17:46.377714] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.237 09:17:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:38.237 09:17:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:38.237 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:38.496 [2024-07-15 09:17:47.220565] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:38.496 [2024-07-15 09:17:47.220606] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:38.496 [2024-07-15 09:17:47.220617] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:38.496 [2024-07-15 09:17:47.220630] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:38.496 [2024-07-15 09:17:47.220638] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:38.496 [2024-07-15 09:17:47.220649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.496 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:38.761 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.761 "name": "Existed_Raid", 00:13:38.761 "uuid": "b59bcce5-09b4-49df-a221-c0cd8c82fd2a", 00:13:38.761 "strip_size_kb": 64, 00:13:38.761 "state": "configuring", 00:13:38.761 "raid_level": "raid0", 00:13:38.761 "superblock": true, 00:13:38.761 "num_base_bdevs": 3, 00:13:38.761 "num_base_bdevs_discovered": 0, 00:13:38.761 "num_base_bdevs_operational": 3, 00:13:38.761 "base_bdevs_list": [ 00:13:38.761 { 00:13:38.761 "name": "BaseBdev1", 00:13:38.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.761 "is_configured": false, 00:13:38.761 "data_offset": 0, 00:13:38.761 "data_size": 0 00:13:38.761 }, 00:13:38.761 { 00:13:38.761 "name": "BaseBdev2", 00:13:38.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.761 "is_configured": false, 00:13:38.761 "data_offset": 0, 00:13:38.761 "data_size": 0 00:13:38.761 }, 00:13:38.761 { 00:13:38.761 "name": "BaseBdev3", 00:13:38.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.761 "is_configured": false, 00:13:38.761 "data_offset": 0, 00:13:38.761 "data_size": 0 00:13:38.761 } 00:13:38.761 ] 00:13:38.761 }' 00:13:38.761 09:17:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.761 09:17:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:39.327 09:17:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:39.586 [2024-07-15 09:17:48.299281] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:39.586 [2024-07-15 09:17:48.299309] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19d0a80 name Existed_Raid, state configuring 00:13:39.586 09:17:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:39.586 [2024-07-15 09:17:48.487807] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:39.586 [2024-07-15 09:17:48.487834] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev1 doesn't exist now 00:13:39.586 [2024-07-15 09:17:48.487844] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:39.586 [2024-07-15 09:17:48.487855] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:39.586 [2024-07-15 09:17:48.487864] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:39.586 [2024-07-15 09:17:48.487875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:39.586 09:17:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:39.844 [2024-07-15 09:17:48.742236] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:39.844 BaseBdev1 00:13:39.844 09:17:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:39.844 09:17:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:39.844 09:17:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:39.844 09:17:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:39.844 09:17:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:39.844 09:17:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:39.844 09:17:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:40.103 09:17:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:40.361 [ 00:13:40.361 { 00:13:40.361 "name": "BaseBdev1", 00:13:40.361 "aliases": [ 00:13:40.361 "8525a651-5cb4-4486-ab0d-14839bd5740c" 00:13:40.361 ], 00:13:40.361 "product_name": "Malloc disk", 00:13:40.361 "block_size": 512, 00:13:40.361 "num_blocks": 65536, 00:13:40.361 "uuid": "8525a651-5cb4-4486-ab0d-14839bd5740c", 00:13:40.361 "assigned_rate_limits": { 00:13:40.361 "rw_ios_per_sec": 0, 00:13:40.361 "rw_mbytes_per_sec": 0, 00:13:40.361 "r_mbytes_per_sec": 0, 00:13:40.361 "w_mbytes_per_sec": 0 00:13:40.361 }, 00:13:40.361 "claimed": true, 00:13:40.361 "claim_type": "exclusive_write", 00:13:40.361 "zoned": false, 00:13:40.361 "supported_io_types": { 00:13:40.361 "read": true, 00:13:40.361 "write": true, 00:13:40.361 "unmap": true, 00:13:40.361 "flush": true, 00:13:40.361 "reset": true, 00:13:40.361 "nvme_admin": false, 00:13:40.361 "nvme_io": false, 00:13:40.361 "nvme_io_md": false, 00:13:40.361 "write_zeroes": true, 00:13:40.361 "zcopy": true, 00:13:40.361 "get_zone_info": false, 00:13:40.361 "zone_management": false, 00:13:40.361 "zone_append": false, 00:13:40.361 "compare": false, 00:13:40.361 "compare_and_write": false, 00:13:40.361 "abort": true, 00:13:40.361 "seek_hole": false, 00:13:40.361 "seek_data": false, 00:13:40.361 "copy": true, 00:13:40.361 "nvme_iov_md": false 00:13:40.361 }, 00:13:40.361 "memory_domains": [ 00:13:40.361 { 00:13:40.361 "dma_device_id": "system", 00:13:40.361 "dma_device_type": 1 00:13:40.361 }, 00:13:40.361 { 00:13:40.361 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:40.361 "dma_device_type": 2 00:13:40.362 } 00:13:40.362 ], 00:13:40.362 "driver_specific": {} 00:13:40.362 } 00:13:40.362 ] 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.362 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:40.620 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.620 "name": "Existed_Raid", 00:13:40.620 "uuid": "e0cb17c9-f42c-4ea8-b22a-b8dd23a12f6c", 00:13:40.620 "strip_size_kb": 64, 00:13:40.620 "state": "configuring", 00:13:40.620 "raid_level": "raid0", 00:13:40.620 "superblock": true, 00:13:40.620 "num_base_bdevs": 3, 00:13:40.620 "num_base_bdevs_discovered": 1, 00:13:40.620 "num_base_bdevs_operational": 3, 00:13:40.620 "base_bdevs_list": [ 00:13:40.620 { 00:13:40.620 "name": "BaseBdev1", 00:13:40.620 "uuid": "8525a651-5cb4-4486-ab0d-14839bd5740c", 00:13:40.620 "is_configured": true, 00:13:40.620 "data_offset": 2048, 00:13:40.620 "data_size": 63488 00:13:40.620 }, 00:13:40.620 { 00:13:40.620 "name": "BaseBdev2", 00:13:40.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.620 "is_configured": false, 00:13:40.620 "data_offset": 0, 00:13:40.620 "data_size": 0 00:13:40.620 }, 00:13:40.620 { 00:13:40.620 "name": "BaseBdev3", 00:13:40.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.620 "is_configured": false, 00:13:40.620 "data_offset": 0, 00:13:40.620 "data_size": 0 00:13:40.620 } 00:13:40.620 ] 00:13:40.620 }' 00:13:40.620 09:17:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.620 09:17:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:41.184 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:41.443 [2024-07-15 09:17:50.326458] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:41.443 
[2024-07-15 09:17:50.326501] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19d0310 name Existed_Raid, state configuring 00:13:41.443 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:41.740 [2024-07-15 09:17:50.571151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:41.740 [2024-07-15 09:17:50.572595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:41.740 [2024-07-15 09:17:50.572627] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:41.740 [2024-07-15 09:17:50.572637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:41.740 [2024-07-15 09:17:50.572649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.740 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:41.998 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.998 "name": "Existed_Raid", 00:13:41.998 "uuid": "c5a2f833-2fbe-472b-8189-bd6f3b071693", 00:13:41.998 "strip_size_kb": 64, 00:13:41.998 "state": "configuring", 00:13:41.998 "raid_level": "raid0", 00:13:41.998 "superblock": true, 00:13:41.998 "num_base_bdevs": 3, 00:13:41.998 "num_base_bdevs_discovered": 1, 00:13:41.998 "num_base_bdevs_operational": 3, 00:13:41.998 "base_bdevs_list": [ 00:13:41.998 { 00:13:41.998 "name": "BaseBdev1", 00:13:41.998 "uuid": "8525a651-5cb4-4486-ab0d-14839bd5740c", 00:13:41.998 "is_configured": true, 00:13:41.998 "data_offset": 2048, 00:13:41.998 "data_size": 63488 00:13:41.998 }, 00:13:41.998 { 
00:13:41.998 "name": "BaseBdev2", 00:13:41.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.998 "is_configured": false, 00:13:41.998 "data_offset": 0, 00:13:41.998 "data_size": 0 00:13:41.998 }, 00:13:41.998 { 00:13:41.998 "name": "BaseBdev3", 00:13:41.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.998 "is_configured": false, 00:13:41.998 "data_offset": 0, 00:13:41.998 "data_size": 0 00:13:41.998 } 00:13:41.998 ] 00:13:41.998 }' 00:13:41.998 09:17:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.998 09:17:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:42.564 09:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:42.823 [2024-07-15 09:17:51.637351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:42.823 BaseBdev2 00:13:42.823 09:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:42.823 09:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:42.823 09:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:42.823 09:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:42.823 09:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:42.823 09:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:42.823 09:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:43.081 09:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:43.339 [ 00:13:43.339 { 00:13:43.339 "name": "BaseBdev2", 00:13:43.339 "aliases": [ 00:13:43.339 "aca1483c-8454-4c0b-b01c-3cdc11229ba2" 00:13:43.339 ], 00:13:43.339 "product_name": "Malloc disk", 00:13:43.339 "block_size": 512, 00:13:43.339 "num_blocks": 65536, 00:13:43.339 "uuid": "aca1483c-8454-4c0b-b01c-3cdc11229ba2", 00:13:43.339 "assigned_rate_limits": { 00:13:43.339 "rw_ios_per_sec": 0, 00:13:43.339 "rw_mbytes_per_sec": 0, 00:13:43.339 "r_mbytes_per_sec": 0, 00:13:43.339 "w_mbytes_per_sec": 0 00:13:43.339 }, 00:13:43.339 "claimed": true, 00:13:43.339 "claim_type": "exclusive_write", 00:13:43.339 "zoned": false, 00:13:43.339 "supported_io_types": { 00:13:43.339 "read": true, 00:13:43.339 "write": true, 00:13:43.339 "unmap": true, 00:13:43.339 "flush": true, 00:13:43.339 "reset": true, 00:13:43.339 "nvme_admin": false, 00:13:43.339 "nvme_io": false, 00:13:43.339 "nvme_io_md": false, 00:13:43.339 "write_zeroes": true, 00:13:43.339 "zcopy": true, 00:13:43.339 "get_zone_info": false, 00:13:43.339 "zone_management": false, 00:13:43.339 "zone_append": false, 00:13:43.339 "compare": false, 00:13:43.339 "compare_and_write": false, 00:13:43.339 "abort": true, 00:13:43.339 "seek_hole": false, 00:13:43.339 "seek_data": false, 00:13:43.339 "copy": true, 00:13:43.339 "nvme_iov_md": false 00:13:43.339 }, 00:13:43.339 "memory_domains": [ 00:13:43.339 { 00:13:43.339 
"dma_device_id": "system", 00:13:43.339 "dma_device_type": 1 00:13:43.339 }, 00:13:43.339 { 00:13:43.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.339 "dma_device_type": 2 00:13:43.339 } 00:13:43.339 ], 00:13:43.339 "driver_specific": {} 00:13:43.339 } 00:13:43.339 ] 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.339 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:43.598 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:43.598 "name": "Existed_Raid", 00:13:43.598 "uuid": "c5a2f833-2fbe-472b-8189-bd6f3b071693", 00:13:43.598 "strip_size_kb": 64, 00:13:43.598 "state": "configuring", 00:13:43.598 "raid_level": "raid0", 00:13:43.598 "superblock": true, 00:13:43.598 "num_base_bdevs": 3, 00:13:43.598 "num_base_bdevs_discovered": 2, 00:13:43.598 "num_base_bdevs_operational": 3, 00:13:43.598 "base_bdevs_list": [ 00:13:43.598 { 00:13:43.598 "name": "BaseBdev1", 00:13:43.598 "uuid": "8525a651-5cb4-4486-ab0d-14839bd5740c", 00:13:43.598 "is_configured": true, 00:13:43.598 "data_offset": 2048, 00:13:43.598 "data_size": 63488 00:13:43.598 }, 00:13:43.598 { 00:13:43.598 "name": "BaseBdev2", 00:13:43.598 "uuid": "aca1483c-8454-4c0b-b01c-3cdc11229ba2", 00:13:43.598 "is_configured": true, 00:13:43.598 "data_offset": 2048, 00:13:43.598 "data_size": 63488 00:13:43.598 }, 00:13:43.598 { 00:13:43.598 "name": "BaseBdev3", 00:13:43.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.598 "is_configured": false, 00:13:43.598 "data_offset": 0, 00:13:43.598 "data_size": 0 00:13:43.598 } 00:13:43.598 ] 00:13:43.598 }' 00:13:43.598 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:43.598 09:17:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:13:44.165 09:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:44.165 [2024-07-15 09:17:53.084515] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:44.165 [2024-07-15 09:17:53.084674] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19d1400 00:13:44.165 [2024-07-15 09:17:53.084688] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:44.165 [2024-07-15 09:17:53.084856] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19d0ef0 00:13:44.165 [2024-07-15 09:17:53.084979] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19d1400 00:13:44.165 [2024-07-15 09:17:53.084990] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19d1400 00:13:44.165 [2024-07-15 09:17:53.085080] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:44.165 BaseBdev3 00:13:44.165 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:44.165 09:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:44.165 09:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:44.165 09:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:44.165 09:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:44.165 09:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:44.165 09:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:44.426 09:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:44.687 [ 00:13:44.687 { 00:13:44.687 "name": "BaseBdev3", 00:13:44.687 "aliases": [ 00:13:44.687 "e40dfc39-904a-47bb-8594-095544d9043f" 00:13:44.687 ], 00:13:44.687 "product_name": "Malloc disk", 00:13:44.687 "block_size": 512, 00:13:44.687 "num_blocks": 65536, 00:13:44.687 "uuid": "e40dfc39-904a-47bb-8594-095544d9043f", 00:13:44.687 "assigned_rate_limits": { 00:13:44.687 "rw_ios_per_sec": 0, 00:13:44.687 "rw_mbytes_per_sec": 0, 00:13:44.687 "r_mbytes_per_sec": 0, 00:13:44.687 "w_mbytes_per_sec": 0 00:13:44.687 }, 00:13:44.687 "claimed": true, 00:13:44.687 "claim_type": "exclusive_write", 00:13:44.687 "zoned": false, 00:13:44.687 "supported_io_types": { 00:13:44.687 "read": true, 00:13:44.687 "write": true, 00:13:44.687 "unmap": true, 00:13:44.687 "flush": true, 00:13:44.687 "reset": true, 00:13:44.687 "nvme_admin": false, 00:13:44.687 "nvme_io": false, 00:13:44.687 "nvme_io_md": false, 00:13:44.687 "write_zeroes": true, 00:13:44.687 "zcopy": true, 00:13:44.687 "get_zone_info": false, 00:13:44.687 "zone_management": false, 00:13:44.687 "zone_append": false, 00:13:44.687 "compare": false, 00:13:44.687 "compare_and_write": false, 00:13:44.687 "abort": true, 00:13:44.687 "seek_hole": false, 00:13:44.687 "seek_data": false, 00:13:44.687 "copy": true, 00:13:44.687 "nvme_iov_md": false 
00:13:44.687 }, 00:13:44.687 "memory_domains": [ 00:13:44.687 { 00:13:44.687 "dma_device_id": "system", 00:13:44.687 "dma_device_type": 1 00:13:44.687 }, 00:13:44.687 { 00:13:44.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:44.687 "dma_device_type": 2 00:13:44.687 } 00:13:44.687 ], 00:13:44.687 "driver_specific": {} 00:13:44.687 } 00:13:44.687 ] 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.687 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.945 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.945 "name": "Existed_Raid", 00:13:44.945 "uuid": "c5a2f833-2fbe-472b-8189-bd6f3b071693", 00:13:44.945 "strip_size_kb": 64, 00:13:44.945 "state": "online", 00:13:44.945 "raid_level": "raid0", 00:13:44.945 "superblock": true, 00:13:44.945 "num_base_bdevs": 3, 00:13:44.945 "num_base_bdevs_discovered": 3, 00:13:44.945 "num_base_bdevs_operational": 3, 00:13:44.945 "base_bdevs_list": [ 00:13:44.945 { 00:13:44.945 "name": "BaseBdev1", 00:13:44.945 "uuid": "8525a651-5cb4-4486-ab0d-14839bd5740c", 00:13:44.945 "is_configured": true, 00:13:44.945 "data_offset": 2048, 00:13:44.945 "data_size": 63488 00:13:44.945 }, 00:13:44.945 { 00:13:44.945 "name": "BaseBdev2", 00:13:44.945 "uuid": "aca1483c-8454-4c0b-b01c-3cdc11229ba2", 00:13:44.945 "is_configured": true, 00:13:44.945 "data_offset": 2048, 00:13:44.945 "data_size": 63488 00:13:44.945 }, 00:13:44.945 { 00:13:44.945 "name": "BaseBdev3", 00:13:44.945 "uuid": "e40dfc39-904a-47bb-8594-095544d9043f", 00:13:44.945 "is_configured": true, 00:13:44.945 "data_offset": 2048, 00:13:44.945 "data_size": 63488 00:13:44.945 } 00:13:44.945 ] 00:13:44.945 }' 00:13:44.945 09:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.945 09:17:53 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:45.511 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:45.511 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:45.511 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:45.511 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:45.511 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:45.511 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:45.511 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:45.511 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:45.769 [2024-07-15 09:17:54.600835] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:45.769 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:45.769 "name": "Existed_Raid", 00:13:45.769 "aliases": [ 00:13:45.769 "c5a2f833-2fbe-472b-8189-bd6f3b071693" 00:13:45.769 ], 00:13:45.769 "product_name": "Raid Volume", 00:13:45.769 "block_size": 512, 00:13:45.769 "num_blocks": 190464, 00:13:45.769 "uuid": "c5a2f833-2fbe-472b-8189-bd6f3b071693", 00:13:45.769 "assigned_rate_limits": { 00:13:45.769 "rw_ios_per_sec": 0, 00:13:45.769 "rw_mbytes_per_sec": 0, 00:13:45.769 "r_mbytes_per_sec": 0, 00:13:45.769 "w_mbytes_per_sec": 0 00:13:45.769 }, 00:13:45.769 "claimed": false, 00:13:45.769 "zoned": false, 00:13:45.769 "supported_io_types": { 00:13:45.769 "read": true, 00:13:45.769 "write": true, 00:13:45.769 "unmap": true, 00:13:45.769 "flush": true, 00:13:45.769 "reset": true, 00:13:45.769 "nvme_admin": false, 00:13:45.769 "nvme_io": false, 00:13:45.769 "nvme_io_md": false, 00:13:45.769 "write_zeroes": true, 00:13:45.769 "zcopy": false, 00:13:45.769 "get_zone_info": false, 00:13:45.769 "zone_management": false, 00:13:45.769 "zone_append": false, 00:13:45.769 "compare": false, 00:13:45.769 "compare_and_write": false, 00:13:45.769 "abort": false, 00:13:45.770 "seek_hole": false, 00:13:45.770 "seek_data": false, 00:13:45.770 "copy": false, 00:13:45.770 "nvme_iov_md": false 00:13:45.770 }, 00:13:45.770 "memory_domains": [ 00:13:45.770 { 00:13:45.770 "dma_device_id": "system", 00:13:45.770 "dma_device_type": 1 00:13:45.770 }, 00:13:45.770 { 00:13:45.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.770 "dma_device_type": 2 00:13:45.770 }, 00:13:45.770 { 00:13:45.770 "dma_device_id": "system", 00:13:45.770 "dma_device_type": 1 00:13:45.770 }, 00:13:45.770 { 00:13:45.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.770 "dma_device_type": 2 00:13:45.770 }, 00:13:45.770 { 00:13:45.770 "dma_device_id": "system", 00:13:45.770 "dma_device_type": 1 00:13:45.770 }, 00:13:45.770 { 00:13:45.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.770 "dma_device_type": 2 00:13:45.770 } 00:13:45.770 ], 00:13:45.770 "driver_specific": { 00:13:45.770 "raid": { 00:13:45.770 "uuid": "c5a2f833-2fbe-472b-8189-bd6f3b071693", 00:13:45.770 "strip_size_kb": 64, 00:13:45.770 "state": "online", 00:13:45.770 "raid_level": "raid0", 00:13:45.770 "superblock": true, 00:13:45.770 
"num_base_bdevs": 3, 00:13:45.770 "num_base_bdevs_discovered": 3, 00:13:45.770 "num_base_bdevs_operational": 3, 00:13:45.770 "base_bdevs_list": [ 00:13:45.770 { 00:13:45.770 "name": "BaseBdev1", 00:13:45.770 "uuid": "8525a651-5cb4-4486-ab0d-14839bd5740c", 00:13:45.770 "is_configured": true, 00:13:45.770 "data_offset": 2048, 00:13:45.770 "data_size": 63488 00:13:45.770 }, 00:13:45.770 { 00:13:45.770 "name": "BaseBdev2", 00:13:45.770 "uuid": "aca1483c-8454-4c0b-b01c-3cdc11229ba2", 00:13:45.770 "is_configured": true, 00:13:45.770 "data_offset": 2048, 00:13:45.770 "data_size": 63488 00:13:45.770 }, 00:13:45.770 { 00:13:45.770 "name": "BaseBdev3", 00:13:45.770 "uuid": "e40dfc39-904a-47bb-8594-095544d9043f", 00:13:45.770 "is_configured": true, 00:13:45.770 "data_offset": 2048, 00:13:45.770 "data_size": 63488 00:13:45.770 } 00:13:45.770 ] 00:13:45.770 } 00:13:45.770 } 00:13:45.770 }' 00:13:45.770 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:45.770 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:45.770 BaseBdev2 00:13:45.770 BaseBdev3' 00:13:45.770 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:45.770 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:45.770 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:46.028 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:46.028 "name": "BaseBdev1", 00:13:46.028 "aliases": [ 00:13:46.028 "8525a651-5cb4-4486-ab0d-14839bd5740c" 00:13:46.028 ], 00:13:46.028 "product_name": "Malloc disk", 00:13:46.028 "block_size": 512, 00:13:46.028 "num_blocks": 65536, 00:13:46.028 "uuid": "8525a651-5cb4-4486-ab0d-14839bd5740c", 00:13:46.028 "assigned_rate_limits": { 00:13:46.028 "rw_ios_per_sec": 0, 00:13:46.028 "rw_mbytes_per_sec": 0, 00:13:46.028 "r_mbytes_per_sec": 0, 00:13:46.028 "w_mbytes_per_sec": 0 00:13:46.028 }, 00:13:46.028 "claimed": true, 00:13:46.028 "claim_type": "exclusive_write", 00:13:46.028 "zoned": false, 00:13:46.028 "supported_io_types": { 00:13:46.028 "read": true, 00:13:46.028 "write": true, 00:13:46.028 "unmap": true, 00:13:46.028 "flush": true, 00:13:46.028 "reset": true, 00:13:46.028 "nvme_admin": false, 00:13:46.028 "nvme_io": false, 00:13:46.028 "nvme_io_md": false, 00:13:46.028 "write_zeroes": true, 00:13:46.028 "zcopy": true, 00:13:46.028 "get_zone_info": false, 00:13:46.028 "zone_management": false, 00:13:46.028 "zone_append": false, 00:13:46.028 "compare": false, 00:13:46.028 "compare_and_write": false, 00:13:46.028 "abort": true, 00:13:46.028 "seek_hole": false, 00:13:46.028 "seek_data": false, 00:13:46.028 "copy": true, 00:13:46.028 "nvme_iov_md": false 00:13:46.028 }, 00:13:46.028 "memory_domains": [ 00:13:46.028 { 00:13:46.028 "dma_device_id": "system", 00:13:46.028 "dma_device_type": 1 00:13:46.028 }, 00:13:46.028 { 00:13:46.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.028 "dma_device_type": 2 00:13:46.028 } 00:13:46.028 ], 00:13:46.028 "driver_specific": {} 00:13:46.028 }' 00:13:46.028 09:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:46.028 09:17:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:46.287 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:46.287 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:46.287 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:46.287 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:46.287 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:46.287 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:46.287 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:46.287 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:46.287 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:46.545 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:46.545 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:46.545 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:46.545 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:46.803 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:46.803 "name": "BaseBdev2", 00:13:46.803 "aliases": [ 00:13:46.803 "aca1483c-8454-4c0b-b01c-3cdc11229ba2" 00:13:46.803 ], 00:13:46.803 "product_name": "Malloc disk", 00:13:46.803 "block_size": 512, 00:13:46.803 "num_blocks": 65536, 00:13:46.803 "uuid": "aca1483c-8454-4c0b-b01c-3cdc11229ba2", 00:13:46.803 "assigned_rate_limits": { 00:13:46.803 "rw_ios_per_sec": 0, 00:13:46.803 "rw_mbytes_per_sec": 0, 00:13:46.803 "r_mbytes_per_sec": 0, 00:13:46.803 "w_mbytes_per_sec": 0 00:13:46.803 }, 00:13:46.803 "claimed": true, 00:13:46.803 "claim_type": "exclusive_write", 00:13:46.803 "zoned": false, 00:13:46.803 "supported_io_types": { 00:13:46.803 "read": true, 00:13:46.803 "write": true, 00:13:46.803 "unmap": true, 00:13:46.803 "flush": true, 00:13:46.803 "reset": true, 00:13:46.803 "nvme_admin": false, 00:13:46.803 "nvme_io": false, 00:13:46.803 "nvme_io_md": false, 00:13:46.803 "write_zeroes": true, 00:13:46.803 "zcopy": true, 00:13:46.803 "get_zone_info": false, 00:13:46.803 "zone_management": false, 00:13:46.803 "zone_append": false, 00:13:46.803 "compare": false, 00:13:46.803 "compare_and_write": false, 00:13:46.803 "abort": true, 00:13:46.803 "seek_hole": false, 00:13:46.803 "seek_data": false, 00:13:46.803 "copy": true, 00:13:46.803 "nvme_iov_md": false 00:13:46.803 }, 00:13:46.803 "memory_domains": [ 00:13:46.803 { 00:13:46.803 "dma_device_id": "system", 00:13:46.803 "dma_device_type": 1 00:13:46.803 }, 00:13:46.803 { 00:13:46.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.803 "dma_device_type": 2 00:13:46.803 } 00:13:46.803 ], 00:13:46.803 "driver_specific": {} 00:13:46.803 }' 00:13:46.803 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:46.803 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:46.803 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # 
[[ 512 == 512 ]] 00:13:46.803 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:46.803 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:46.803 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:46.803 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:46.803 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.061 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:47.061 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.061 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.061 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:47.061 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:47.061 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:47.061 09:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:47.319 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:47.319 "name": "BaseBdev3", 00:13:47.319 "aliases": [ 00:13:47.319 "e40dfc39-904a-47bb-8594-095544d9043f" 00:13:47.319 ], 00:13:47.319 "product_name": "Malloc disk", 00:13:47.319 "block_size": 512, 00:13:47.319 "num_blocks": 65536, 00:13:47.319 "uuid": "e40dfc39-904a-47bb-8594-095544d9043f", 00:13:47.319 "assigned_rate_limits": { 00:13:47.319 "rw_ios_per_sec": 0, 00:13:47.319 "rw_mbytes_per_sec": 0, 00:13:47.319 "r_mbytes_per_sec": 0, 00:13:47.319 "w_mbytes_per_sec": 0 00:13:47.319 }, 00:13:47.319 "claimed": true, 00:13:47.319 "claim_type": "exclusive_write", 00:13:47.319 "zoned": false, 00:13:47.319 "supported_io_types": { 00:13:47.319 "read": true, 00:13:47.319 "write": true, 00:13:47.319 "unmap": true, 00:13:47.319 "flush": true, 00:13:47.319 "reset": true, 00:13:47.319 "nvme_admin": false, 00:13:47.319 "nvme_io": false, 00:13:47.319 "nvme_io_md": false, 00:13:47.319 "write_zeroes": true, 00:13:47.319 "zcopy": true, 00:13:47.319 "get_zone_info": false, 00:13:47.319 "zone_management": false, 00:13:47.319 "zone_append": false, 00:13:47.319 "compare": false, 00:13:47.319 "compare_and_write": false, 00:13:47.319 "abort": true, 00:13:47.319 "seek_hole": false, 00:13:47.319 "seek_data": false, 00:13:47.319 "copy": true, 00:13:47.319 "nvme_iov_md": false 00:13:47.319 }, 00:13:47.319 "memory_domains": [ 00:13:47.319 { 00:13:47.319 "dma_device_id": "system", 00:13:47.319 "dma_device_type": 1 00:13:47.319 }, 00:13:47.319 { 00:13:47.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.319 "dma_device_type": 2 00:13:47.319 } 00:13:47.319 ], 00:13:47.319 "driver_specific": {} 00:13:47.319 }' 00:13:47.319 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:47.319 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:47.319 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:47.319 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:47.319 
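The repeated jq filters through this stretch come from verify_raid_bdev_properties comparing the raid volume against each configured base bdev. A simplified sketch of that loop follows; raid_info, base_info and the field list are illustrative names, and the bare test stands in for the error handling the real helper in bdev_raid.sh performs.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Pull the raid volume's descriptor, walk its configured base bdevs, and compare
# block_size, md_size, md_interleave and dif_type field by field. Each filter is
# applied once to the raid and once to the base bdev, which is why every jq
# expression appears twice per check in the trace.
raid_info=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')
for name in $(echo "$raid_info" | jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'); do
    base_info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
    for field in .block_size .md_size .md_interleave .dif_type; do
        [[ $(echo "$raid_info" | jq "$field") == $(echo "$base_info" | jq "$field") ]] || echo "$name: $field mismatch"
    done
done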
09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:47.577 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:47.577 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.577 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.577 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:47.577 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.577 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.577 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:47.577 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:47.835 [2024-07-15 09:17:56.714224] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:47.835 [2024-07-15 09:17:56.714251] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:47.835 [2024-07-15 09:17:56.714294] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.835 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.093 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:13:48.093 "name": "Existed_Raid", 00:13:48.093 "uuid": "c5a2f833-2fbe-472b-8189-bd6f3b071693", 00:13:48.093 "strip_size_kb": 64, 00:13:48.093 "state": "offline", 00:13:48.093 "raid_level": "raid0", 00:13:48.093 "superblock": true, 00:13:48.093 "num_base_bdevs": 3, 00:13:48.093 "num_base_bdevs_discovered": 2, 00:13:48.093 "num_base_bdevs_operational": 2, 00:13:48.093 "base_bdevs_list": [ 00:13:48.093 { 00:13:48.093 "name": null, 00:13:48.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.093 "is_configured": false, 00:13:48.093 "data_offset": 2048, 00:13:48.093 "data_size": 63488 00:13:48.093 }, 00:13:48.093 { 00:13:48.093 "name": "BaseBdev2", 00:13:48.093 "uuid": "aca1483c-8454-4c0b-b01c-3cdc11229ba2", 00:13:48.093 "is_configured": true, 00:13:48.093 "data_offset": 2048, 00:13:48.093 "data_size": 63488 00:13:48.093 }, 00:13:48.093 { 00:13:48.093 "name": "BaseBdev3", 00:13:48.093 "uuid": "e40dfc39-904a-47bb-8594-095544d9043f", 00:13:48.093 "is_configured": true, 00:13:48.093 "data_offset": 2048, 00:13:48.093 "data_size": 63488 00:13:48.093 } 00:13:48.093 ] 00:13:48.093 }' 00:13:48.093 09:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.093 09:17:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:48.658 09:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:48.658 09:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:48.658 09:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.658 09:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:48.917 09:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:48.917 09:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:48.917 09:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:49.176 [2024-07-15 09:17:58.054803] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:49.176 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:49.176 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:49.176 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.176 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:49.434 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:49.434 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:49.434 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:49.692 [2024-07-15 09:17:58.554763] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:49.692 [2024-07-15 09:17:58.554809] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19d1400 name Existed_Raid, state offline 00:13:49.692 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:49.692 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:49.692 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.692 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:49.950 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:49.950 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:49.950 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:49.950 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:49.950 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:49.950 09:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:50.208 BaseBdev2 00:13:50.208 09:17:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:50.208 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:50.208 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:50.208 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:50.209 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:50.209 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:50.209 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:50.467 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:50.726 [ 00:13:50.726 { 00:13:50.726 "name": "BaseBdev2", 00:13:50.726 "aliases": [ 00:13:50.726 "d39038a6-b9d8-4244-a5ac-7aa40ad65b66" 00:13:50.726 ], 00:13:50.726 "product_name": "Malloc disk", 00:13:50.726 "block_size": 512, 00:13:50.726 "num_blocks": 65536, 00:13:50.726 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:13:50.726 "assigned_rate_limits": { 00:13:50.726 "rw_ios_per_sec": 0, 00:13:50.726 "rw_mbytes_per_sec": 0, 00:13:50.726 "r_mbytes_per_sec": 0, 00:13:50.726 "w_mbytes_per_sec": 0 00:13:50.726 }, 00:13:50.726 "claimed": false, 00:13:50.726 "zoned": false, 00:13:50.726 "supported_io_types": { 00:13:50.726 "read": true, 00:13:50.726 "write": true, 00:13:50.726 "unmap": true, 00:13:50.726 "flush": true, 00:13:50.726 "reset": true, 00:13:50.726 "nvme_admin": false, 00:13:50.726 "nvme_io": false, 00:13:50.726 "nvme_io_md": false, 00:13:50.726 "write_zeroes": true, 00:13:50.726 "zcopy": true, 00:13:50.726 "get_zone_info": false, 00:13:50.726 "zone_management": false, 00:13:50.726 
"zone_append": false, 00:13:50.726 "compare": false, 00:13:50.726 "compare_and_write": false, 00:13:50.726 "abort": true, 00:13:50.726 "seek_hole": false, 00:13:50.726 "seek_data": false, 00:13:50.726 "copy": true, 00:13:50.726 "nvme_iov_md": false 00:13:50.726 }, 00:13:50.726 "memory_domains": [ 00:13:50.726 { 00:13:50.726 "dma_device_id": "system", 00:13:50.726 "dma_device_type": 1 00:13:50.726 }, 00:13:50.726 { 00:13:50.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.726 "dma_device_type": 2 00:13:50.726 } 00:13:50.726 ], 00:13:50.726 "driver_specific": {} 00:13:50.726 } 00:13:50.726 ] 00:13:50.726 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:50.726 09:17:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:50.726 09:17:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:50.726 09:17:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:50.985 BaseBdev3 00:13:50.985 09:17:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:50.985 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:50.985 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:50.985 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:50.985 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:50.985 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:50.985 09:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:51.243 09:18:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:51.502 [ 00:13:51.502 { 00:13:51.502 "name": "BaseBdev3", 00:13:51.502 "aliases": [ 00:13:51.502 "7fb2c722-0397-4dc5-845e-0e157e76cda5" 00:13:51.502 ], 00:13:51.502 "product_name": "Malloc disk", 00:13:51.502 "block_size": 512, 00:13:51.502 "num_blocks": 65536, 00:13:51.502 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:13:51.502 "assigned_rate_limits": { 00:13:51.502 "rw_ios_per_sec": 0, 00:13:51.502 "rw_mbytes_per_sec": 0, 00:13:51.502 "r_mbytes_per_sec": 0, 00:13:51.502 "w_mbytes_per_sec": 0 00:13:51.502 }, 00:13:51.502 "claimed": false, 00:13:51.502 "zoned": false, 00:13:51.502 "supported_io_types": { 00:13:51.502 "read": true, 00:13:51.502 "write": true, 00:13:51.502 "unmap": true, 00:13:51.502 "flush": true, 00:13:51.502 "reset": true, 00:13:51.502 "nvme_admin": false, 00:13:51.502 "nvme_io": false, 00:13:51.502 "nvme_io_md": false, 00:13:51.502 "write_zeroes": true, 00:13:51.502 "zcopy": true, 00:13:51.502 "get_zone_info": false, 00:13:51.502 "zone_management": false, 00:13:51.502 "zone_append": false, 00:13:51.502 "compare": false, 00:13:51.502 "compare_and_write": false, 00:13:51.502 "abort": true, 00:13:51.502 "seek_hole": false, 00:13:51.502 "seek_data": false, 00:13:51.502 "copy": true, 00:13:51.502 "nvme_iov_md": false 
00:13:51.502 }, 00:13:51.502 "memory_domains": [ 00:13:51.502 { 00:13:51.502 "dma_device_id": "system", 00:13:51.502 "dma_device_type": 1 00:13:51.502 }, 00:13:51.502 { 00:13:51.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.502 "dma_device_type": 2 00:13:51.502 } 00:13:51.502 ], 00:13:51.502 "driver_specific": {} 00:13:51.502 } 00:13:51.502 ] 00:13:51.502 09:18:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:51.502 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:51.502 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:51.502 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:51.760 [2024-07-15 09:18:00.537950] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:51.760 [2024-07-15 09:18:00.537993] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:51.760 [2024-07-15 09:18:00.538014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:51.760 [2024-07-15 09:18:00.539389] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.760 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:52.018 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.018 "name": "Existed_Raid", 00:13:52.018 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:13:52.018 "strip_size_kb": 64, 00:13:52.018 "state": "configuring", 00:13:52.018 "raid_level": "raid0", 00:13:52.018 "superblock": true, 00:13:52.018 "num_base_bdevs": 3, 00:13:52.018 "num_base_bdevs_discovered": 2, 00:13:52.018 "num_base_bdevs_operational": 3, 00:13:52.018 "base_bdevs_list": [ 00:13:52.018 { 00:13:52.018 "name": "BaseBdev1", 00:13:52.018 
"uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.018 "is_configured": false, 00:13:52.018 "data_offset": 0, 00:13:52.018 "data_size": 0 00:13:52.018 }, 00:13:52.018 { 00:13:52.018 "name": "BaseBdev2", 00:13:52.018 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:13:52.018 "is_configured": true, 00:13:52.018 "data_offset": 2048, 00:13:52.018 "data_size": 63488 00:13:52.018 }, 00:13:52.018 { 00:13:52.018 "name": "BaseBdev3", 00:13:52.018 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:13:52.018 "is_configured": true, 00:13:52.018 "data_offset": 2048, 00:13:52.018 "data_size": 63488 00:13:52.018 } 00:13:52.018 ] 00:13:52.018 }' 00:13:52.018 09:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.018 09:18:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:52.582 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:52.838 [2024-07-15 09:18:01.552586] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.838 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.128 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.128 "name": "Existed_Raid", 00:13:53.128 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:13:53.128 "strip_size_kb": 64, 00:13:53.128 "state": "configuring", 00:13:53.128 "raid_level": "raid0", 00:13:53.128 "superblock": true, 00:13:53.128 "num_base_bdevs": 3, 00:13:53.128 "num_base_bdevs_discovered": 1, 00:13:53.128 "num_base_bdevs_operational": 3, 00:13:53.128 "base_bdevs_list": [ 00:13:53.128 { 00:13:53.128 "name": "BaseBdev1", 00:13:53.128 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.128 "is_configured": false, 00:13:53.128 "data_offset": 0, 00:13:53.128 "data_size": 0 00:13:53.128 }, 00:13:53.128 { 00:13:53.128 "name": null, 00:13:53.128 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:13:53.128 
"is_configured": false, 00:13:53.128 "data_offset": 2048, 00:13:53.128 "data_size": 63488 00:13:53.128 }, 00:13:53.128 { 00:13:53.128 "name": "BaseBdev3", 00:13:53.128 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:13:53.128 "is_configured": true, 00:13:53.128 "data_offset": 2048, 00:13:53.128 "data_size": 63488 00:13:53.128 } 00:13:53.128 ] 00:13:53.128 }' 00:13:53.128 09:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.128 09:18:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:53.693 09:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.693 09:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:53.949 09:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:53.949 09:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:54.207 [2024-07-15 09:18:02.904123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:54.207 BaseBdev1 00:13:54.207 09:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:54.207 09:18:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:54.207 09:18:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:54.207 09:18:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:54.207 09:18:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:54.207 09:18:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:54.207 09:18:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:54.207 09:18:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:54.464 [ 00:13:54.464 { 00:13:54.464 "name": "BaseBdev1", 00:13:54.464 "aliases": [ 00:13:54.464 "36eb3de4-2bc2-408c-81c8-40e021b0f5a0" 00:13:54.464 ], 00:13:54.464 "product_name": "Malloc disk", 00:13:54.464 "block_size": 512, 00:13:54.464 "num_blocks": 65536, 00:13:54.464 "uuid": "36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:13:54.464 "assigned_rate_limits": { 00:13:54.464 "rw_ios_per_sec": 0, 00:13:54.464 "rw_mbytes_per_sec": 0, 00:13:54.464 "r_mbytes_per_sec": 0, 00:13:54.464 "w_mbytes_per_sec": 0 00:13:54.464 }, 00:13:54.464 "claimed": true, 00:13:54.464 "claim_type": "exclusive_write", 00:13:54.464 "zoned": false, 00:13:54.464 "supported_io_types": { 00:13:54.464 "read": true, 00:13:54.464 "write": true, 00:13:54.464 "unmap": true, 00:13:54.464 "flush": true, 00:13:54.464 "reset": true, 00:13:54.464 "nvme_admin": false, 00:13:54.464 "nvme_io": false, 00:13:54.464 "nvme_io_md": false, 00:13:54.464 "write_zeroes": true, 00:13:54.464 "zcopy": true, 00:13:54.464 "get_zone_info": false, 00:13:54.464 "zone_management": 
false, 00:13:54.464 "zone_append": false, 00:13:54.464 "compare": false, 00:13:54.464 "compare_and_write": false, 00:13:54.464 "abort": true, 00:13:54.464 "seek_hole": false, 00:13:54.464 "seek_data": false, 00:13:54.464 "copy": true, 00:13:54.464 "nvme_iov_md": false 00:13:54.464 }, 00:13:54.464 "memory_domains": [ 00:13:54.464 { 00:13:54.464 "dma_device_id": "system", 00:13:54.464 "dma_device_type": 1 00:13:54.464 }, 00:13:54.464 { 00:13:54.464 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.464 "dma_device_type": 2 00:13:54.464 } 00:13:54.464 ], 00:13:54.464 "driver_specific": {} 00:13:54.464 } 00:13:54.464 ] 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:54.464 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:54.465 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:54.465 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:54.465 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.722 09:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:54.722 "name": "Existed_Raid", 00:13:54.722 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:13:54.722 "strip_size_kb": 64, 00:13:54.722 "state": "configuring", 00:13:54.722 "raid_level": "raid0", 00:13:54.722 "superblock": true, 00:13:54.722 "num_base_bdevs": 3, 00:13:54.722 "num_base_bdevs_discovered": 2, 00:13:54.722 "num_base_bdevs_operational": 3, 00:13:54.722 "base_bdevs_list": [ 00:13:54.722 { 00:13:54.722 "name": "BaseBdev1", 00:13:54.722 "uuid": "36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:13:54.722 "is_configured": true, 00:13:54.722 "data_offset": 2048, 00:13:54.722 "data_size": 63488 00:13:54.722 }, 00:13:54.722 { 00:13:54.722 "name": null, 00:13:54.722 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:13:54.722 "is_configured": false, 00:13:54.722 "data_offset": 2048, 00:13:54.722 "data_size": 63488 00:13:54.722 }, 00:13:54.722 { 00:13:54.722 "name": "BaseBdev3", 00:13:54.722 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:13:54.722 "is_configured": true, 00:13:54.722 "data_offset": 2048, 00:13:54.722 "data_size": 63488 00:13:54.722 } 00:13:54.722 ] 00:13:54.722 }' 00:13:54.722 09:18:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:54.722 09:18:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:55.288 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.288 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:55.548 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:55.548 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:55.806 [2024-07-15 09:18:04.596631] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.806 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.065 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.065 "name": "Existed_Raid", 00:13:56.065 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:13:56.065 "strip_size_kb": 64, 00:13:56.065 "state": "configuring", 00:13:56.065 "raid_level": "raid0", 00:13:56.065 "superblock": true, 00:13:56.065 "num_base_bdevs": 3, 00:13:56.065 "num_base_bdevs_discovered": 1, 00:13:56.065 "num_base_bdevs_operational": 3, 00:13:56.065 "base_bdevs_list": [ 00:13:56.065 { 00:13:56.065 "name": "BaseBdev1", 00:13:56.065 "uuid": "36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:13:56.065 "is_configured": true, 00:13:56.065 "data_offset": 2048, 00:13:56.065 "data_size": 63488 00:13:56.065 }, 00:13:56.065 { 00:13:56.065 "name": null, 00:13:56.065 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:13:56.065 "is_configured": false, 00:13:56.065 "data_offset": 2048, 00:13:56.065 "data_size": 63488 00:13:56.065 }, 00:13:56.065 { 00:13:56.065 "name": null, 00:13:56.065 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:13:56.065 "is_configured": false, 
00:13:56.065 "data_offset": 2048, 00:13:56.065 "data_size": 63488 00:13:56.065 } 00:13:56.065 ] 00:13:56.065 }' 00:13:56.065 09:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.065 09:18:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:56.631 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:56.631 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.889 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:56.889 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:57.147 [2024-07-15 09:18:05.940238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.147 09:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.405 09:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.405 "name": "Existed_Raid", 00:13:57.405 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:13:57.405 "strip_size_kb": 64, 00:13:57.405 "state": "configuring", 00:13:57.405 "raid_level": "raid0", 00:13:57.405 "superblock": true, 00:13:57.405 "num_base_bdevs": 3, 00:13:57.405 "num_base_bdevs_discovered": 2, 00:13:57.405 "num_base_bdevs_operational": 3, 00:13:57.405 "base_bdevs_list": [ 00:13:57.405 { 00:13:57.405 "name": "BaseBdev1", 00:13:57.405 "uuid": "36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:13:57.405 "is_configured": true, 00:13:57.405 "data_offset": 2048, 00:13:57.405 "data_size": 63488 00:13:57.405 }, 00:13:57.405 { 00:13:57.405 "name": null, 00:13:57.405 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:13:57.405 "is_configured": false, 00:13:57.405 "data_offset": 
2048, 00:13:57.405 "data_size": 63488 00:13:57.405 }, 00:13:57.405 { 00:13:57.405 "name": "BaseBdev3", 00:13:57.405 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:13:57.405 "is_configured": true, 00:13:57.405 "data_offset": 2048, 00:13:57.405 "data_size": 63488 00:13:57.405 } 00:13:57.405 ] 00:13:57.405 }' 00:13:57.405 09:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.405 09:18:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.971 09:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.971 09:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:58.261 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:58.261 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:58.519 [2024-07-15 09:18:07.267780] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.519 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.778 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.778 "name": "Existed_Raid", 00:13:58.778 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:13:58.778 "strip_size_kb": 64, 00:13:58.778 "state": "configuring", 00:13:58.778 "raid_level": "raid0", 00:13:58.778 "superblock": true, 00:13:58.778 "num_base_bdevs": 3, 00:13:58.778 "num_base_bdevs_discovered": 1, 00:13:58.778 "num_base_bdevs_operational": 3, 00:13:58.778 "base_bdevs_list": [ 00:13:58.778 { 00:13:58.778 "name": null, 00:13:58.778 "uuid": "36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:13:58.778 "is_configured": false, 00:13:58.778 "data_offset": 2048, 00:13:58.778 "data_size": 63488 00:13:58.778 }, 00:13:58.778 
{ 00:13:58.778 "name": null, 00:13:58.778 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:13:58.778 "is_configured": false, 00:13:58.778 "data_offset": 2048, 00:13:58.778 "data_size": 63488 00:13:58.778 }, 00:13:58.778 { 00:13:58.778 "name": "BaseBdev3", 00:13:58.778 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:13:58.778 "is_configured": true, 00:13:58.778 "data_offset": 2048, 00:13:58.778 "data_size": 63488 00:13:58.778 } 00:13:58.778 ] 00:13:58.778 }' 00:13:58.778 09:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.778 09:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.344 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:59.344 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.602 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:59.602 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:59.860 [2024-07-15 09:18:08.625952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.860 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.118 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.118 "name": "Existed_Raid", 00:14:00.118 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:14:00.118 "strip_size_kb": 64, 00:14:00.118 "state": "configuring", 00:14:00.118 "raid_level": "raid0", 00:14:00.119 "superblock": true, 00:14:00.119 "num_base_bdevs": 3, 00:14:00.119 "num_base_bdevs_discovered": 2, 00:14:00.119 "num_base_bdevs_operational": 3, 00:14:00.119 "base_bdevs_list": [ 00:14:00.119 { 00:14:00.119 "name": 
null, 00:14:00.119 "uuid": "36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:14:00.119 "is_configured": false, 00:14:00.119 "data_offset": 2048, 00:14:00.119 "data_size": 63488 00:14:00.119 }, 00:14:00.119 { 00:14:00.119 "name": "BaseBdev2", 00:14:00.119 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:14:00.119 "is_configured": true, 00:14:00.119 "data_offset": 2048, 00:14:00.119 "data_size": 63488 00:14:00.119 }, 00:14:00.119 { 00:14:00.119 "name": "BaseBdev3", 00:14:00.119 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:14:00.119 "is_configured": true, 00:14:00.119 "data_offset": 2048, 00:14:00.119 "data_size": 63488 00:14:00.119 } 00:14:00.119 ] 00:14:00.119 }' 00:14:00.119 09:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.119 09:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:00.684 09:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:00.684 09:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.941 09:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:00.941 09:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.941 09:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:00.941 09:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 36eb3de4-2bc2-408c-81c8-40e021b0f5a0 00:14:01.199 [2024-07-15 09:18:10.122270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:01.199 [2024-07-15 09:18:10.122417] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19cfe90 00:14:01.199 [2024-07-15 09:18:10.122430] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:01.199 [2024-07-15 09:18:10.122601] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16d6940 00:14:01.199 [2024-07-15 09:18:10.122722] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19cfe90 00:14:01.199 [2024-07-15 09:18:10.122732] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19cfe90 00:14:01.199 [2024-07-15 09:18:10.122824] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:01.199 NewBaseBdev 00:14:01.199 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:01.199 09:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:01.199 09:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:01.199 09:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:01.199 09:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:01.199 09:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:01.199 09:18:10 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:01.457 09:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:01.715 [ 00:14:01.715 { 00:14:01.715 "name": "NewBaseBdev", 00:14:01.715 "aliases": [ 00:14:01.715 "36eb3de4-2bc2-408c-81c8-40e021b0f5a0" 00:14:01.715 ], 00:14:01.715 "product_name": "Malloc disk", 00:14:01.715 "block_size": 512, 00:14:01.715 "num_blocks": 65536, 00:14:01.715 "uuid": "36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:14:01.715 "assigned_rate_limits": { 00:14:01.715 "rw_ios_per_sec": 0, 00:14:01.715 "rw_mbytes_per_sec": 0, 00:14:01.715 "r_mbytes_per_sec": 0, 00:14:01.715 "w_mbytes_per_sec": 0 00:14:01.715 }, 00:14:01.715 "claimed": true, 00:14:01.715 "claim_type": "exclusive_write", 00:14:01.715 "zoned": false, 00:14:01.715 "supported_io_types": { 00:14:01.715 "read": true, 00:14:01.715 "write": true, 00:14:01.715 "unmap": true, 00:14:01.715 "flush": true, 00:14:01.715 "reset": true, 00:14:01.715 "nvme_admin": false, 00:14:01.715 "nvme_io": false, 00:14:01.715 "nvme_io_md": false, 00:14:01.715 "write_zeroes": true, 00:14:01.715 "zcopy": true, 00:14:01.715 "get_zone_info": false, 00:14:01.715 "zone_management": false, 00:14:01.715 "zone_append": false, 00:14:01.715 "compare": false, 00:14:01.715 "compare_and_write": false, 00:14:01.715 "abort": true, 00:14:01.715 "seek_hole": false, 00:14:01.715 "seek_data": false, 00:14:01.715 "copy": true, 00:14:01.715 "nvme_iov_md": false 00:14:01.715 }, 00:14:01.715 "memory_domains": [ 00:14:01.715 { 00:14:01.715 "dma_device_id": "system", 00:14:01.715 "dma_device_type": 1 00:14:01.715 }, 00:14:01.715 { 00:14:01.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.715 "dma_device_type": 2 00:14:01.715 } 00:14:01.715 ], 00:14:01.715 "driver_specific": {} 00:14:01.715 } 00:14:01.715 ] 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:14:01.715 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.974 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.974 "name": "Existed_Raid", 00:14:01.974 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:14:01.974 "strip_size_kb": 64, 00:14:01.974 "state": "online", 00:14:01.974 "raid_level": "raid0", 00:14:01.974 "superblock": true, 00:14:01.974 "num_base_bdevs": 3, 00:14:01.974 "num_base_bdevs_discovered": 3, 00:14:01.974 "num_base_bdevs_operational": 3, 00:14:01.974 "base_bdevs_list": [ 00:14:01.974 { 00:14:01.974 "name": "NewBaseBdev", 00:14:01.974 "uuid": "36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:14:01.974 "is_configured": true, 00:14:01.974 "data_offset": 2048, 00:14:01.974 "data_size": 63488 00:14:01.974 }, 00:14:01.974 { 00:14:01.974 "name": "BaseBdev2", 00:14:01.974 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:14:01.974 "is_configured": true, 00:14:01.974 "data_offset": 2048, 00:14:01.974 "data_size": 63488 00:14:01.974 }, 00:14:01.974 { 00:14:01.974 "name": "BaseBdev3", 00:14:01.974 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:14:01.974 "is_configured": true, 00:14:01.974 "data_offset": 2048, 00:14:01.974 "data_size": 63488 00:14:01.974 } 00:14:01.974 ] 00:14:01.974 }' 00:14:01.974 09:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.974 09:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:02.540 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:02.540 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:02.540 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:02.540 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:02.540 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:02.540 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:02.540 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:02.540 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:02.799 [2024-07-15 09:18:11.670701] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:02.799 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:02.799 "name": "Existed_Raid", 00:14:02.799 "aliases": [ 00:14:02.799 "8fec37bf-66b0-4592-9e83-1291499315d1" 00:14:02.799 ], 00:14:02.799 "product_name": "Raid Volume", 00:14:02.799 "block_size": 512, 00:14:02.799 "num_blocks": 190464, 00:14:02.799 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:14:02.799 "assigned_rate_limits": { 00:14:02.799 "rw_ios_per_sec": 0, 00:14:02.799 "rw_mbytes_per_sec": 0, 00:14:02.799 "r_mbytes_per_sec": 0, 00:14:02.799 "w_mbytes_per_sec": 0 00:14:02.799 }, 00:14:02.799 "claimed": false, 00:14:02.799 "zoned": false, 00:14:02.799 "supported_io_types": { 00:14:02.799 "read": true, 00:14:02.799 "write": true, 00:14:02.799 "unmap": true, 00:14:02.799 "flush": true, 00:14:02.799 "reset": true, 
00:14:02.799 "nvme_admin": false, 00:14:02.799 "nvme_io": false, 00:14:02.799 "nvme_io_md": false, 00:14:02.799 "write_zeroes": true, 00:14:02.799 "zcopy": false, 00:14:02.799 "get_zone_info": false, 00:14:02.799 "zone_management": false, 00:14:02.799 "zone_append": false, 00:14:02.799 "compare": false, 00:14:02.799 "compare_and_write": false, 00:14:02.799 "abort": false, 00:14:02.799 "seek_hole": false, 00:14:02.799 "seek_data": false, 00:14:02.799 "copy": false, 00:14:02.799 "nvme_iov_md": false 00:14:02.799 }, 00:14:02.799 "memory_domains": [ 00:14:02.799 { 00:14:02.799 "dma_device_id": "system", 00:14:02.799 "dma_device_type": 1 00:14:02.799 }, 00:14:02.799 { 00:14:02.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.799 "dma_device_type": 2 00:14:02.799 }, 00:14:02.799 { 00:14:02.799 "dma_device_id": "system", 00:14:02.799 "dma_device_type": 1 00:14:02.799 }, 00:14:02.799 { 00:14:02.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.799 "dma_device_type": 2 00:14:02.799 }, 00:14:02.799 { 00:14:02.799 "dma_device_id": "system", 00:14:02.799 "dma_device_type": 1 00:14:02.799 }, 00:14:02.799 { 00:14:02.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.799 "dma_device_type": 2 00:14:02.799 } 00:14:02.799 ], 00:14:02.799 "driver_specific": { 00:14:02.799 "raid": { 00:14:02.799 "uuid": "8fec37bf-66b0-4592-9e83-1291499315d1", 00:14:02.799 "strip_size_kb": 64, 00:14:02.799 "state": "online", 00:14:02.799 "raid_level": "raid0", 00:14:02.799 "superblock": true, 00:14:02.799 "num_base_bdevs": 3, 00:14:02.799 "num_base_bdevs_discovered": 3, 00:14:02.799 "num_base_bdevs_operational": 3, 00:14:02.799 "base_bdevs_list": [ 00:14:02.799 { 00:14:02.799 "name": "NewBaseBdev", 00:14:02.799 "uuid": "36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:14:02.799 "is_configured": true, 00:14:02.799 "data_offset": 2048, 00:14:02.799 "data_size": 63488 00:14:02.799 }, 00:14:02.799 { 00:14:02.799 "name": "BaseBdev2", 00:14:02.799 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:14:02.799 "is_configured": true, 00:14:02.799 "data_offset": 2048, 00:14:02.799 "data_size": 63488 00:14:02.799 }, 00:14:02.799 { 00:14:02.799 "name": "BaseBdev3", 00:14:02.799 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:14:02.799 "is_configured": true, 00:14:02.799 "data_offset": 2048, 00:14:02.799 "data_size": 63488 00:14:02.799 } 00:14:02.799 ] 00:14:02.799 } 00:14:02.799 } 00:14:02.799 }' 00:14:02.799 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:02.799 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:02.799 BaseBdev2 00:14:02.799 BaseBdev3' 00:14:02.799 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:02.799 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:02.799 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:03.058 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:03.058 "name": "NewBaseBdev", 00:14:03.058 "aliases": [ 00:14:03.058 "36eb3de4-2bc2-408c-81c8-40e021b0f5a0" 00:14:03.058 ], 00:14:03.058 "product_name": "Malloc disk", 00:14:03.058 "block_size": 512, 00:14:03.058 "num_blocks": 65536, 00:14:03.058 "uuid": 
"36eb3de4-2bc2-408c-81c8-40e021b0f5a0", 00:14:03.058 "assigned_rate_limits": { 00:14:03.058 "rw_ios_per_sec": 0, 00:14:03.058 "rw_mbytes_per_sec": 0, 00:14:03.058 "r_mbytes_per_sec": 0, 00:14:03.058 "w_mbytes_per_sec": 0 00:14:03.058 }, 00:14:03.058 "claimed": true, 00:14:03.058 "claim_type": "exclusive_write", 00:14:03.058 "zoned": false, 00:14:03.058 "supported_io_types": { 00:14:03.058 "read": true, 00:14:03.058 "write": true, 00:14:03.058 "unmap": true, 00:14:03.058 "flush": true, 00:14:03.058 "reset": true, 00:14:03.058 "nvme_admin": false, 00:14:03.058 "nvme_io": false, 00:14:03.058 "nvme_io_md": false, 00:14:03.058 "write_zeroes": true, 00:14:03.058 "zcopy": true, 00:14:03.058 "get_zone_info": false, 00:14:03.058 "zone_management": false, 00:14:03.058 "zone_append": false, 00:14:03.058 "compare": false, 00:14:03.058 "compare_and_write": false, 00:14:03.058 "abort": true, 00:14:03.058 "seek_hole": false, 00:14:03.058 "seek_data": false, 00:14:03.058 "copy": true, 00:14:03.058 "nvme_iov_md": false 00:14:03.058 }, 00:14:03.058 "memory_domains": [ 00:14:03.058 { 00:14:03.058 "dma_device_id": "system", 00:14:03.058 "dma_device_type": 1 00:14:03.058 }, 00:14:03.058 { 00:14:03.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.058 "dma_device_type": 2 00:14:03.058 } 00:14:03.058 ], 00:14:03.058 "driver_specific": {} 00:14:03.058 }' 00:14:03.058 09:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.316 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.575 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:03.575 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:03.575 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:03.575 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:03.575 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:03.575 "name": "BaseBdev2", 00:14:03.575 "aliases": [ 00:14:03.575 "d39038a6-b9d8-4244-a5ac-7aa40ad65b66" 00:14:03.575 ], 00:14:03.575 "product_name": "Malloc disk", 00:14:03.575 "block_size": 512, 00:14:03.575 "num_blocks": 65536, 00:14:03.575 "uuid": "d39038a6-b9d8-4244-a5ac-7aa40ad65b66", 00:14:03.575 "assigned_rate_limits": { 00:14:03.575 "rw_ios_per_sec": 0, 00:14:03.575 
"rw_mbytes_per_sec": 0, 00:14:03.575 "r_mbytes_per_sec": 0, 00:14:03.575 "w_mbytes_per_sec": 0 00:14:03.575 }, 00:14:03.575 "claimed": true, 00:14:03.575 "claim_type": "exclusive_write", 00:14:03.575 "zoned": false, 00:14:03.575 "supported_io_types": { 00:14:03.575 "read": true, 00:14:03.575 "write": true, 00:14:03.575 "unmap": true, 00:14:03.575 "flush": true, 00:14:03.575 "reset": true, 00:14:03.575 "nvme_admin": false, 00:14:03.575 "nvme_io": false, 00:14:03.575 "nvme_io_md": false, 00:14:03.575 "write_zeroes": true, 00:14:03.575 "zcopy": true, 00:14:03.575 "get_zone_info": false, 00:14:03.575 "zone_management": false, 00:14:03.575 "zone_append": false, 00:14:03.575 "compare": false, 00:14:03.575 "compare_and_write": false, 00:14:03.575 "abort": true, 00:14:03.575 "seek_hole": false, 00:14:03.575 "seek_data": false, 00:14:03.575 "copy": true, 00:14:03.575 "nvme_iov_md": false 00:14:03.575 }, 00:14:03.575 "memory_domains": [ 00:14:03.575 { 00:14:03.575 "dma_device_id": "system", 00:14:03.575 "dma_device_type": 1 00:14:03.575 }, 00:14:03.575 { 00:14:03.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.575 "dma_device_type": 2 00:14:03.575 } 00:14:03.575 ], 00:14:03.575 "driver_specific": {} 00:14:03.575 }' 00:14:03.575 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.575 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:03.833 09:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:04.092 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:04.092 "name": "BaseBdev3", 00:14:04.092 "aliases": [ 00:14:04.092 "7fb2c722-0397-4dc5-845e-0e157e76cda5" 00:14:04.092 ], 00:14:04.092 "product_name": "Malloc disk", 00:14:04.092 "block_size": 512, 00:14:04.092 "num_blocks": 65536, 00:14:04.092 "uuid": "7fb2c722-0397-4dc5-845e-0e157e76cda5", 00:14:04.092 "assigned_rate_limits": { 00:14:04.092 "rw_ios_per_sec": 0, 00:14:04.092 "rw_mbytes_per_sec": 0, 00:14:04.092 "r_mbytes_per_sec": 0, 00:14:04.092 "w_mbytes_per_sec": 0 00:14:04.092 }, 00:14:04.092 
"claimed": true, 00:14:04.092 "claim_type": "exclusive_write", 00:14:04.092 "zoned": false, 00:14:04.092 "supported_io_types": { 00:14:04.092 "read": true, 00:14:04.092 "write": true, 00:14:04.092 "unmap": true, 00:14:04.092 "flush": true, 00:14:04.092 "reset": true, 00:14:04.092 "nvme_admin": false, 00:14:04.092 "nvme_io": false, 00:14:04.092 "nvme_io_md": false, 00:14:04.092 "write_zeroes": true, 00:14:04.092 "zcopy": true, 00:14:04.092 "get_zone_info": false, 00:14:04.092 "zone_management": false, 00:14:04.092 "zone_append": false, 00:14:04.092 "compare": false, 00:14:04.092 "compare_and_write": false, 00:14:04.092 "abort": true, 00:14:04.092 "seek_hole": false, 00:14:04.092 "seek_data": false, 00:14:04.092 "copy": true, 00:14:04.092 "nvme_iov_md": false 00:14:04.092 }, 00:14:04.092 "memory_domains": [ 00:14:04.092 { 00:14:04.092 "dma_device_id": "system", 00:14:04.092 "dma_device_type": 1 00:14:04.092 }, 00:14:04.092 { 00:14:04.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.092 "dma_device_type": 2 00:14:04.092 } 00:14:04.092 ], 00:14:04.092 "driver_specific": {} 00:14:04.092 }' 00:14:04.092 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.350 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.608 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:04.608 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:04.608 [2024-07-15 09:18:13.479228] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:04.608 [2024-07-15 09:18:13.479251] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:04.608 [2024-07-15 09:18:13.479301] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:04.608 [2024-07-15 09:18:13.479352] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:04.608 [2024-07-15 09:18:13.479364] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19cfe90 name Existed_Raid, state offline 00:14:04.608 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 108859 00:14:04.608 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 108859 ']' 00:14:04.608 09:18:13 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@952 -- # kill -0 108859 00:14:04.608 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:04.608 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:04.609 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 108859 00:14:04.609 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:04.609 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:04.609 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 108859' 00:14:04.609 killing process with pid 108859 00:14:04.609 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 108859 00:14:04.609 [2024-07-15 09:18:13.548652] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:04.609 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 108859 00:14:04.866 [2024-07-15 09:18:13.576197] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:04.866 09:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:04.866 00:14:04.866 real 0m27.788s 00:14:04.866 user 0m50.888s 00:14:04.866 sys 0m5.069s 00:14:04.866 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:04.866 09:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:04.866 ************************************ 00:14:04.866 END TEST raid_state_function_test_sb 00:14:04.866 ************************************ 00:14:05.125 09:18:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:05.125 09:18:13 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:05.125 09:18:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:05.125 09:18:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:05.125 09:18:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:05.125 ************************************ 00:14:05.125 START TEST raid_superblock_test 00:14:05.125 ************************************ 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:05.125 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@398 -- # local strip_size 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=113497 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 113497 /var/tmp/spdk-raid.sock 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 113497 ']' 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:05.126 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:05.126 09:18:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.126 [2024-07-15 09:18:13.942359] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:14:05.126 [2024-07-15 09:18:13.942422] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid113497 ] 00:14:05.126 [2024-07-15 09:18:14.068754] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:05.384 [2024-07-15 09:18:14.173719] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:05.384 [2024-07-15 09:18:14.233002] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:05.384 [2024-07-15 09:18:14.233027] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:05.950 09:18:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:05.951 09:18:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:06.208 malloc1 00:14:06.208 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:06.465 [2024-07-15 09:18:15.348166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:06.465 [2024-07-15 09:18:15.348215] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:06.465 [2024-07-15 09:18:15.348236] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19da570 00:14:06.465 [2024-07-15 09:18:15.348249] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:06.465 [2024-07-15 09:18:15.350001] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:06.465 [2024-07-15 09:18:15.350029] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:06.465 pt1 00:14:06.465 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:06.465 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:06.465 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:06.465 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:06.465 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:06.465 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:06.465 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:06.465 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:06.465 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:06.721 malloc2 00:14:06.721 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:06.979 [2024-07-15 09:18:15.842250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:06.979 [2024-07-15 09:18:15.842297] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:06.979 [2024-07-15 09:18:15.842315] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19db970 00:14:06.979 [2024-07-15 09:18:15.842327] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:06.979 [2024-07-15 09:18:15.843937] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:06.979 [2024-07-15 09:18:15.843965] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:06.979 pt2 00:14:06.979 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:06.979 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:06.979 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:06.979 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:06.979 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:06.979 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:06.979 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:06.979 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:06.979 09:18:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:07.236 malloc3 00:14:07.236 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:07.494 [2024-07-15 09:18:16.332777] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:07.494 [2024-07-15 09:18:16.332824] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:07.494 [2024-07-15 09:18:16.332842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b72340 00:14:07.494 [2024-07-15 09:18:16.332855] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:07.494 [2024-07-15 09:18:16.334442] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:07.494 [2024-07-15 09:18:16.334470] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:07.494 pt3 00:14:07.494 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:07.494 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:07.494 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:07.752 [2024-07-15 09:18:16.581449] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:07.752 [2024-07-15 09:18:16.582846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:07.752 [2024-07-15 09:18:16.582901] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:07.752 [2024-07-15 09:18:16.583063] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19d2ea0 00:14:07.752 [2024-07-15 09:18:16.583075] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:07.752 [2024-07-15 09:18:16.583279] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19da240 00:14:07.752 [2024-07-15 09:18:16.583421] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19d2ea0 00:14:07.752 [2024-07-15 09:18:16.583431] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19d2ea0 00:14:07.752 [2024-07-15 09:18:16.583529] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.752 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:08.010 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.010 "name": "raid_bdev1", 00:14:08.010 "uuid": "fa3f66d2-0495-47d6-b28d-475700ea53e3", 00:14:08.010 "strip_size_kb": 64, 00:14:08.010 "state": "online", 00:14:08.010 "raid_level": "raid0", 00:14:08.010 "superblock": true, 00:14:08.010 "num_base_bdevs": 3, 
00:14:08.010 "num_base_bdevs_discovered": 3, 00:14:08.010 "num_base_bdevs_operational": 3, 00:14:08.010 "base_bdevs_list": [ 00:14:08.010 { 00:14:08.011 "name": "pt1", 00:14:08.011 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:08.011 "is_configured": true, 00:14:08.011 "data_offset": 2048, 00:14:08.011 "data_size": 63488 00:14:08.011 }, 00:14:08.011 { 00:14:08.011 "name": "pt2", 00:14:08.011 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:08.011 "is_configured": true, 00:14:08.011 "data_offset": 2048, 00:14:08.011 "data_size": 63488 00:14:08.011 }, 00:14:08.011 { 00:14:08.011 "name": "pt3", 00:14:08.011 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:08.011 "is_configured": true, 00:14:08.011 "data_offset": 2048, 00:14:08.011 "data_size": 63488 00:14:08.011 } 00:14:08.011 ] 00:14:08.011 }' 00:14:08.011 09:18:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.011 09:18:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.576 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:08.576 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:08.576 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:08.576 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:08.576 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:08.576 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:08.576 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:08.576 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:08.833 [2024-07-15 09:18:17.672573] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:08.833 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:08.833 "name": "raid_bdev1", 00:14:08.833 "aliases": [ 00:14:08.833 "fa3f66d2-0495-47d6-b28d-475700ea53e3" 00:14:08.833 ], 00:14:08.833 "product_name": "Raid Volume", 00:14:08.833 "block_size": 512, 00:14:08.833 "num_blocks": 190464, 00:14:08.833 "uuid": "fa3f66d2-0495-47d6-b28d-475700ea53e3", 00:14:08.833 "assigned_rate_limits": { 00:14:08.833 "rw_ios_per_sec": 0, 00:14:08.833 "rw_mbytes_per_sec": 0, 00:14:08.833 "r_mbytes_per_sec": 0, 00:14:08.833 "w_mbytes_per_sec": 0 00:14:08.833 }, 00:14:08.833 "claimed": false, 00:14:08.833 "zoned": false, 00:14:08.833 "supported_io_types": { 00:14:08.833 "read": true, 00:14:08.833 "write": true, 00:14:08.833 "unmap": true, 00:14:08.833 "flush": true, 00:14:08.833 "reset": true, 00:14:08.833 "nvme_admin": false, 00:14:08.833 "nvme_io": false, 00:14:08.833 "nvme_io_md": false, 00:14:08.833 "write_zeroes": true, 00:14:08.833 "zcopy": false, 00:14:08.833 "get_zone_info": false, 00:14:08.833 "zone_management": false, 00:14:08.833 "zone_append": false, 00:14:08.833 "compare": false, 00:14:08.833 "compare_and_write": false, 00:14:08.833 "abort": false, 00:14:08.833 "seek_hole": false, 00:14:08.833 "seek_data": false, 00:14:08.833 "copy": false, 00:14:08.834 "nvme_iov_md": false 00:14:08.834 }, 00:14:08.834 "memory_domains": [ 00:14:08.834 { 00:14:08.834 "dma_device_id": "system", 00:14:08.834 "dma_device_type": 1 
00:14:08.834 }, 00:14:08.834 { 00:14:08.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.834 "dma_device_type": 2 00:14:08.834 }, 00:14:08.834 { 00:14:08.834 "dma_device_id": "system", 00:14:08.834 "dma_device_type": 1 00:14:08.834 }, 00:14:08.834 { 00:14:08.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.834 "dma_device_type": 2 00:14:08.834 }, 00:14:08.834 { 00:14:08.834 "dma_device_id": "system", 00:14:08.834 "dma_device_type": 1 00:14:08.834 }, 00:14:08.834 { 00:14:08.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.834 "dma_device_type": 2 00:14:08.834 } 00:14:08.834 ], 00:14:08.834 "driver_specific": { 00:14:08.834 "raid": { 00:14:08.834 "uuid": "fa3f66d2-0495-47d6-b28d-475700ea53e3", 00:14:08.834 "strip_size_kb": 64, 00:14:08.834 "state": "online", 00:14:08.834 "raid_level": "raid0", 00:14:08.834 "superblock": true, 00:14:08.834 "num_base_bdevs": 3, 00:14:08.834 "num_base_bdevs_discovered": 3, 00:14:08.834 "num_base_bdevs_operational": 3, 00:14:08.834 "base_bdevs_list": [ 00:14:08.834 { 00:14:08.834 "name": "pt1", 00:14:08.834 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:08.834 "is_configured": true, 00:14:08.834 "data_offset": 2048, 00:14:08.834 "data_size": 63488 00:14:08.834 }, 00:14:08.834 { 00:14:08.834 "name": "pt2", 00:14:08.834 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:08.834 "is_configured": true, 00:14:08.834 "data_offset": 2048, 00:14:08.834 "data_size": 63488 00:14:08.834 }, 00:14:08.834 { 00:14:08.834 "name": "pt3", 00:14:08.834 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:08.834 "is_configured": true, 00:14:08.834 "data_offset": 2048, 00:14:08.834 "data_size": 63488 00:14:08.834 } 00:14:08.834 ] 00:14:08.834 } 00:14:08.834 } 00:14:08.834 }' 00:14:08.834 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:08.834 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:08.834 pt2 00:14:08.834 pt3' 00:14:08.834 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:08.834 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:08.834 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:09.092 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:09.092 "name": "pt1", 00:14:09.092 "aliases": [ 00:14:09.092 "00000000-0000-0000-0000-000000000001" 00:14:09.092 ], 00:14:09.092 "product_name": "passthru", 00:14:09.092 "block_size": 512, 00:14:09.092 "num_blocks": 65536, 00:14:09.092 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:09.092 "assigned_rate_limits": { 00:14:09.092 "rw_ios_per_sec": 0, 00:14:09.092 "rw_mbytes_per_sec": 0, 00:14:09.092 "r_mbytes_per_sec": 0, 00:14:09.092 "w_mbytes_per_sec": 0 00:14:09.092 }, 00:14:09.092 "claimed": true, 00:14:09.092 "claim_type": "exclusive_write", 00:14:09.092 "zoned": false, 00:14:09.092 "supported_io_types": { 00:14:09.092 "read": true, 00:14:09.092 "write": true, 00:14:09.092 "unmap": true, 00:14:09.092 "flush": true, 00:14:09.092 "reset": true, 00:14:09.092 "nvme_admin": false, 00:14:09.092 "nvme_io": false, 00:14:09.092 "nvme_io_md": false, 00:14:09.092 "write_zeroes": true, 00:14:09.092 "zcopy": true, 00:14:09.092 "get_zone_info": false, 00:14:09.092 "zone_management": false, 
00:14:09.092 "zone_append": false, 00:14:09.092 "compare": false, 00:14:09.092 "compare_and_write": false, 00:14:09.092 "abort": true, 00:14:09.092 "seek_hole": false, 00:14:09.092 "seek_data": false, 00:14:09.092 "copy": true, 00:14:09.092 "nvme_iov_md": false 00:14:09.092 }, 00:14:09.092 "memory_domains": [ 00:14:09.092 { 00:14:09.092 "dma_device_id": "system", 00:14:09.092 "dma_device_type": 1 00:14:09.092 }, 00:14:09.092 { 00:14:09.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.092 "dma_device_type": 2 00:14:09.092 } 00:14:09.092 ], 00:14:09.092 "driver_specific": { 00:14:09.092 "passthru": { 00:14:09.092 "name": "pt1", 00:14:09.092 "base_bdev_name": "malloc1" 00:14:09.092 } 00:14:09.092 } 00:14:09.092 }' 00:14:09.092 09:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.092 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.349 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:09.349 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.349 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.349 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:09.349 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.349 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.349 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:09.350 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.350 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.607 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:09.607 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:09.607 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:09.607 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:09.865 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:09.865 "name": "pt2", 00:14:09.865 "aliases": [ 00:14:09.865 "00000000-0000-0000-0000-000000000002" 00:14:09.865 ], 00:14:09.865 "product_name": "passthru", 00:14:09.865 "block_size": 512, 00:14:09.865 "num_blocks": 65536, 00:14:09.865 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:09.865 "assigned_rate_limits": { 00:14:09.865 "rw_ios_per_sec": 0, 00:14:09.865 "rw_mbytes_per_sec": 0, 00:14:09.865 "r_mbytes_per_sec": 0, 00:14:09.865 "w_mbytes_per_sec": 0 00:14:09.865 }, 00:14:09.865 "claimed": true, 00:14:09.865 "claim_type": "exclusive_write", 00:14:09.865 "zoned": false, 00:14:09.865 "supported_io_types": { 00:14:09.865 "read": true, 00:14:09.865 "write": true, 00:14:09.865 "unmap": true, 00:14:09.865 "flush": true, 00:14:09.865 "reset": true, 00:14:09.865 "nvme_admin": false, 00:14:09.865 "nvme_io": false, 00:14:09.865 "nvme_io_md": false, 00:14:09.865 "write_zeroes": true, 00:14:09.865 "zcopy": true, 00:14:09.865 "get_zone_info": false, 00:14:09.865 "zone_management": false, 00:14:09.865 "zone_append": false, 00:14:09.865 "compare": false, 00:14:09.865 "compare_and_write": false, 00:14:09.865 "abort": true, 
00:14:09.865 "seek_hole": false, 00:14:09.865 "seek_data": false, 00:14:09.865 "copy": true, 00:14:09.865 "nvme_iov_md": false 00:14:09.865 }, 00:14:09.865 "memory_domains": [ 00:14:09.865 { 00:14:09.865 "dma_device_id": "system", 00:14:09.865 "dma_device_type": 1 00:14:09.865 }, 00:14:09.865 { 00:14:09.865 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.865 "dma_device_type": 2 00:14:09.865 } 00:14:09.865 ], 00:14:09.865 "driver_specific": { 00:14:09.865 "passthru": { 00:14:09.865 "name": "pt2", 00:14:09.865 "base_bdev_name": "malloc2" 00:14:09.865 } 00:14:09.865 } 00:14:09.865 }' 00:14:09.865 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.865 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.865 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:09.865 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.865 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.865 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:09.865 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.865 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:10.123 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:10.123 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.123 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.123 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:10.123 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:10.123 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:10.123 09:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:10.380 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:10.380 "name": "pt3", 00:14:10.380 "aliases": [ 00:14:10.380 "00000000-0000-0000-0000-000000000003" 00:14:10.380 ], 00:14:10.380 "product_name": "passthru", 00:14:10.380 "block_size": 512, 00:14:10.380 "num_blocks": 65536, 00:14:10.380 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:10.380 "assigned_rate_limits": { 00:14:10.380 "rw_ios_per_sec": 0, 00:14:10.380 "rw_mbytes_per_sec": 0, 00:14:10.380 "r_mbytes_per_sec": 0, 00:14:10.380 "w_mbytes_per_sec": 0 00:14:10.380 }, 00:14:10.380 "claimed": true, 00:14:10.380 "claim_type": "exclusive_write", 00:14:10.380 "zoned": false, 00:14:10.380 "supported_io_types": { 00:14:10.380 "read": true, 00:14:10.380 "write": true, 00:14:10.380 "unmap": true, 00:14:10.380 "flush": true, 00:14:10.380 "reset": true, 00:14:10.380 "nvme_admin": false, 00:14:10.380 "nvme_io": false, 00:14:10.380 "nvme_io_md": false, 00:14:10.380 "write_zeroes": true, 00:14:10.380 "zcopy": true, 00:14:10.380 "get_zone_info": false, 00:14:10.380 "zone_management": false, 00:14:10.380 "zone_append": false, 00:14:10.380 "compare": false, 00:14:10.380 "compare_and_write": false, 00:14:10.380 "abort": true, 00:14:10.380 "seek_hole": false, 00:14:10.380 "seek_data": false, 00:14:10.380 "copy": true, 00:14:10.380 "nvme_iov_md": false 
00:14:10.380 }, 00:14:10.380 "memory_domains": [ 00:14:10.380 { 00:14:10.380 "dma_device_id": "system", 00:14:10.380 "dma_device_type": 1 00:14:10.380 }, 00:14:10.380 { 00:14:10.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.380 "dma_device_type": 2 00:14:10.380 } 00:14:10.380 ], 00:14:10.380 "driver_specific": { 00:14:10.380 "passthru": { 00:14:10.380 "name": "pt3", 00:14:10.380 "base_bdev_name": "malloc3" 00:14:10.380 } 00:14:10.380 } 00:14:10.380 }' 00:14:10.380 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.380 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.380 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:10.380 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:10.638 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:10.895 [2024-07-15 09:18:19.709970] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:10.895 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=fa3f66d2-0495-47d6-b28d-475700ea53e3 00:14:10.895 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z fa3f66d2-0495-47d6-b28d-475700ea53e3 ']' 00:14:10.895 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:11.153 [2024-07-15 09:18:19.958360] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:11.153 [2024-07-15 09:18:19.958382] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:11.153 [2024-07-15 09:18:19.958434] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:11.153 [2024-07-15 09:18:19.958487] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:11.153 [2024-07-15 09:18:19.958499] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19d2ea0 name raid_bdev1, state offline 00:14:11.153 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.153 09:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:11.411 09:18:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:11.411 09:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:11.411 09:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:11.411 09:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:11.669 09:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:11.669 09:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:11.928 09:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:11.928 09:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:12.186 09:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:12.186 09:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:12.444 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:14:12.702 [2024-07-15 09:18:21.398132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:12.702 [2024-07-15 09:18:21.399475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:12.702 [2024-07-15 09:18:21.399517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:12.702 [2024-07-15 09:18:21.399562] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:12.702 [2024-07-15 09:18:21.399599] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:12.702 [2024-07-15 09:18:21.399622] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:12.702 [2024-07-15 09:18:21.399640] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:12.702 [2024-07-15 09:18:21.399651] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b7dff0 name raid_bdev1, state configuring 00:14:12.702 request: 00:14:12.702 { 00:14:12.702 "name": "raid_bdev1", 00:14:12.702 "raid_level": "raid0", 00:14:12.702 "base_bdevs": [ 00:14:12.702 "malloc1", 00:14:12.702 "malloc2", 00:14:12.702 "malloc3" 00:14:12.702 ], 00:14:12.702 "strip_size_kb": 64, 00:14:12.702 "superblock": false, 00:14:12.702 "method": "bdev_raid_create", 00:14:12.702 "req_id": 1 00:14:12.702 } 00:14:12.702 Got JSON-RPC error response 00:14:12.702 response: 00:14:12.702 { 00:14:12.702 "code": -17, 00:14:12.702 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:12.702 } 00:14:12.702 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:12.702 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:12.702 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:12.702 09:18:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:12.702 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.702 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:12.961 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:12.961 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:12.961 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:12.961 [2024-07-15 09:18:21.895378] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:12.961 [2024-07-15 09:18:21.895415] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:12.961 [2024-07-15 09:18:21.895435] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19da7a0 00:14:12.961 [2024-07-15 09:18:21.895448] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:12.961 [2024-07-15 09:18:21.896939] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:12.961 [2024-07-15 09:18:21.896969] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:12.961 [2024-07-15 09:18:21.897025] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:12.961 [2024-07-15 09:18:21.897050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:12.961 pt1 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.219 09:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:13.219 09:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.219 "name": "raid_bdev1", 00:14:13.219 "uuid": "fa3f66d2-0495-47d6-b28d-475700ea53e3", 00:14:13.219 "strip_size_kb": 64, 00:14:13.219 "state": "configuring", 00:14:13.219 "raid_level": "raid0", 00:14:13.219 "superblock": true, 00:14:13.219 "num_base_bdevs": 3, 00:14:13.219 "num_base_bdevs_discovered": 1, 00:14:13.219 "num_base_bdevs_operational": 3, 00:14:13.219 "base_bdevs_list": [ 00:14:13.219 { 00:14:13.219 "name": "pt1", 00:14:13.219 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:13.219 "is_configured": true, 00:14:13.219 "data_offset": 2048, 00:14:13.219 "data_size": 63488 00:14:13.219 }, 00:14:13.219 { 00:14:13.219 "name": null, 00:14:13.219 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:13.219 "is_configured": false, 00:14:13.219 "data_offset": 2048, 00:14:13.219 "data_size": 63488 00:14:13.219 }, 00:14:13.219 { 00:14:13.219 "name": null, 00:14:13.219 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:13.219 "is_configured": false, 00:14:13.219 "data_offset": 2048, 00:14:13.219 "data_size": 63488 00:14:13.219 } 00:14:13.219 ] 00:14:13.219 }' 00:14:13.219 09:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.219 09:18:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.149 09:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:14.149 09:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:14.149 [2024-07-15 09:18:22.986307] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:14.149 [2024-07-15 09:18:22.986354] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:14.149 [2024-07-15 09:18:22.986373] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19d1c70 00:14:14.149 [2024-07-15 09:18:22.986386] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:14.149 [2024-07-15 09:18:22.986722] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:14.149 [2024-07-15 09:18:22.986740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:14.149 [2024-07-15 09:18:22.986800] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:14.150 [2024-07-15 09:18:22.986819] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:14.150 pt2 00:14:14.150 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:14.439 [2024-07-15 09:18:23.230966] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.439 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:14.721 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:14.721 "name": "raid_bdev1", 00:14:14.721 "uuid": "fa3f66d2-0495-47d6-b28d-475700ea53e3", 00:14:14.721 "strip_size_kb": 64, 00:14:14.721 "state": "configuring", 00:14:14.721 "raid_level": "raid0", 00:14:14.721 "superblock": true, 00:14:14.721 "num_base_bdevs": 3, 00:14:14.721 "num_base_bdevs_discovered": 1, 00:14:14.721 "num_base_bdevs_operational": 3, 00:14:14.721 "base_bdevs_list": [ 00:14:14.721 { 00:14:14.721 "name": "pt1", 00:14:14.721 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:14.721 "is_configured": true, 00:14:14.721 "data_offset": 2048, 00:14:14.721 "data_size": 63488 00:14:14.721 }, 00:14:14.721 { 00:14:14.721 "name": null, 00:14:14.721 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:14.721 "is_configured": false, 00:14:14.721 
"data_offset": 2048, 00:14:14.721 "data_size": 63488 00:14:14.721 }, 00:14:14.721 { 00:14:14.721 "name": null, 00:14:14.721 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:14.721 "is_configured": false, 00:14:14.721 "data_offset": 2048, 00:14:14.721 "data_size": 63488 00:14:14.721 } 00:14:14.721 ] 00:14:14.721 }' 00:14:14.721 09:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:14.721 09:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:15.284 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:15.284 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:15.284 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:15.541 [2024-07-15 09:18:24.313831] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:15.541 [2024-07-15 09:18:24.313880] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:15.541 [2024-07-15 09:18:24.313898] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b72fa0 00:14:15.541 [2024-07-15 09:18:24.313910] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.541 [2024-07-15 09:18:24.314254] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.541 [2024-07-15 09:18:24.314272] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:15.541 [2024-07-15 09:18:24.314334] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:15.541 [2024-07-15 09:18:24.314352] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:15.541 pt2 00:14:15.541 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:15.541 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:15.541 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:15.798 [2024-07-15 09:18:24.558488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:15.798 [2024-07-15 09:18:24.558524] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:15.798 [2024-07-15 09:18:24.558542] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b73b30 00:14:15.798 [2024-07-15 09:18:24.558554] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.798 [2024-07-15 09:18:24.558838] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.798 [2024-07-15 09:18:24.558856] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:15.798 [2024-07-15 09:18:24.558906] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:15.798 [2024-07-15 09:18:24.558923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:15.798 [2024-07-15 09:18:24.559062] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b74c00 00:14:15.798 [2024-07-15 09:18:24.559077] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:15.798 [2024-07-15 09:18:24.559249] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b7d9b0 00:14:15.798 [2024-07-15 09:18:24.559373] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b74c00 00:14:15.798 [2024-07-15 09:18:24.559383] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b74c00 00:14:15.798 [2024-07-15 09:18:24.559478] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:15.798 pt3 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.798 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:16.055 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.055 "name": "raid_bdev1", 00:14:16.055 "uuid": "fa3f66d2-0495-47d6-b28d-475700ea53e3", 00:14:16.055 "strip_size_kb": 64, 00:14:16.055 "state": "online", 00:14:16.055 "raid_level": "raid0", 00:14:16.055 "superblock": true, 00:14:16.055 "num_base_bdevs": 3, 00:14:16.055 "num_base_bdevs_discovered": 3, 00:14:16.055 "num_base_bdevs_operational": 3, 00:14:16.055 "base_bdevs_list": [ 00:14:16.056 { 00:14:16.056 "name": "pt1", 00:14:16.056 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:16.056 "is_configured": true, 00:14:16.056 "data_offset": 2048, 00:14:16.056 "data_size": 63488 00:14:16.056 }, 00:14:16.056 { 00:14:16.056 "name": "pt2", 00:14:16.056 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:16.056 "is_configured": true, 00:14:16.056 "data_offset": 2048, 00:14:16.056 "data_size": 63488 00:14:16.056 }, 00:14:16.056 { 00:14:16.056 "name": "pt3", 00:14:16.056 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:16.056 "is_configured": true, 00:14:16.056 "data_offset": 2048, 00:14:16.056 "data_size": 63488 00:14:16.056 } 00:14:16.056 ] 00:14:16.056 }' 00:14:16.056 09:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.056 09:18:24 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:16.622 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:16.622 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:16.622 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:16.622 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:16.622 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:16.622 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:16.622 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:16.622 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:16.880 [2024-07-15 09:18:25.625588] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:16.880 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:16.880 "name": "raid_bdev1", 00:14:16.880 "aliases": [ 00:14:16.880 "fa3f66d2-0495-47d6-b28d-475700ea53e3" 00:14:16.880 ], 00:14:16.881 "product_name": "Raid Volume", 00:14:16.881 "block_size": 512, 00:14:16.881 "num_blocks": 190464, 00:14:16.881 "uuid": "fa3f66d2-0495-47d6-b28d-475700ea53e3", 00:14:16.881 "assigned_rate_limits": { 00:14:16.881 "rw_ios_per_sec": 0, 00:14:16.881 "rw_mbytes_per_sec": 0, 00:14:16.881 "r_mbytes_per_sec": 0, 00:14:16.881 "w_mbytes_per_sec": 0 00:14:16.881 }, 00:14:16.881 "claimed": false, 00:14:16.881 "zoned": false, 00:14:16.881 "supported_io_types": { 00:14:16.881 "read": true, 00:14:16.881 "write": true, 00:14:16.881 "unmap": true, 00:14:16.881 "flush": true, 00:14:16.881 "reset": true, 00:14:16.881 "nvme_admin": false, 00:14:16.881 "nvme_io": false, 00:14:16.881 "nvme_io_md": false, 00:14:16.881 "write_zeroes": true, 00:14:16.881 "zcopy": false, 00:14:16.881 "get_zone_info": false, 00:14:16.881 "zone_management": false, 00:14:16.881 "zone_append": false, 00:14:16.881 "compare": false, 00:14:16.881 "compare_and_write": false, 00:14:16.881 "abort": false, 00:14:16.881 "seek_hole": false, 00:14:16.881 "seek_data": false, 00:14:16.881 "copy": false, 00:14:16.881 "nvme_iov_md": false 00:14:16.881 }, 00:14:16.881 "memory_domains": [ 00:14:16.881 { 00:14:16.881 "dma_device_id": "system", 00:14:16.881 "dma_device_type": 1 00:14:16.881 }, 00:14:16.881 { 00:14:16.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.881 "dma_device_type": 2 00:14:16.881 }, 00:14:16.881 { 00:14:16.881 "dma_device_id": "system", 00:14:16.881 "dma_device_type": 1 00:14:16.881 }, 00:14:16.881 { 00:14:16.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.881 "dma_device_type": 2 00:14:16.881 }, 00:14:16.881 { 00:14:16.881 "dma_device_id": "system", 00:14:16.881 "dma_device_type": 1 00:14:16.881 }, 00:14:16.881 { 00:14:16.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.881 "dma_device_type": 2 00:14:16.881 } 00:14:16.881 ], 00:14:16.881 "driver_specific": { 00:14:16.881 "raid": { 00:14:16.881 "uuid": "fa3f66d2-0495-47d6-b28d-475700ea53e3", 00:14:16.881 "strip_size_kb": 64, 00:14:16.881 "state": "online", 00:14:16.881 "raid_level": "raid0", 00:14:16.881 "superblock": true, 00:14:16.881 "num_base_bdevs": 3, 00:14:16.881 "num_base_bdevs_discovered": 3, 
00:14:16.881 "num_base_bdevs_operational": 3, 00:14:16.881 "base_bdevs_list": [ 00:14:16.881 { 00:14:16.881 "name": "pt1", 00:14:16.881 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:16.881 "is_configured": true, 00:14:16.881 "data_offset": 2048, 00:14:16.881 "data_size": 63488 00:14:16.881 }, 00:14:16.881 { 00:14:16.881 "name": "pt2", 00:14:16.881 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:16.881 "is_configured": true, 00:14:16.881 "data_offset": 2048, 00:14:16.881 "data_size": 63488 00:14:16.881 }, 00:14:16.881 { 00:14:16.881 "name": "pt3", 00:14:16.881 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:16.881 "is_configured": true, 00:14:16.881 "data_offset": 2048, 00:14:16.881 "data_size": 63488 00:14:16.881 } 00:14:16.881 ] 00:14:16.881 } 00:14:16.881 } 00:14:16.881 }' 00:14:16.881 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:16.881 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:16.881 pt2 00:14:16.881 pt3' 00:14:16.881 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:16.881 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:16.881 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:17.139 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.139 "name": "pt1", 00:14:17.139 "aliases": [ 00:14:17.139 "00000000-0000-0000-0000-000000000001" 00:14:17.139 ], 00:14:17.139 "product_name": "passthru", 00:14:17.139 "block_size": 512, 00:14:17.139 "num_blocks": 65536, 00:14:17.139 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:17.139 "assigned_rate_limits": { 00:14:17.139 "rw_ios_per_sec": 0, 00:14:17.139 "rw_mbytes_per_sec": 0, 00:14:17.139 "r_mbytes_per_sec": 0, 00:14:17.139 "w_mbytes_per_sec": 0 00:14:17.139 }, 00:14:17.139 "claimed": true, 00:14:17.139 "claim_type": "exclusive_write", 00:14:17.139 "zoned": false, 00:14:17.139 "supported_io_types": { 00:14:17.139 "read": true, 00:14:17.139 "write": true, 00:14:17.139 "unmap": true, 00:14:17.139 "flush": true, 00:14:17.139 "reset": true, 00:14:17.139 "nvme_admin": false, 00:14:17.139 "nvme_io": false, 00:14:17.139 "nvme_io_md": false, 00:14:17.139 "write_zeroes": true, 00:14:17.139 "zcopy": true, 00:14:17.139 "get_zone_info": false, 00:14:17.139 "zone_management": false, 00:14:17.139 "zone_append": false, 00:14:17.139 "compare": false, 00:14:17.139 "compare_and_write": false, 00:14:17.139 "abort": true, 00:14:17.139 "seek_hole": false, 00:14:17.139 "seek_data": false, 00:14:17.139 "copy": true, 00:14:17.139 "nvme_iov_md": false 00:14:17.139 }, 00:14:17.139 "memory_domains": [ 00:14:17.139 { 00:14:17.139 "dma_device_id": "system", 00:14:17.140 "dma_device_type": 1 00:14:17.140 }, 00:14:17.140 { 00:14:17.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.140 "dma_device_type": 2 00:14:17.140 } 00:14:17.140 ], 00:14:17.140 "driver_specific": { 00:14:17.140 "passthru": { 00:14:17.140 "name": "pt1", 00:14:17.140 "base_bdev_name": "malloc1" 00:14:17.140 } 00:14:17.140 } 00:14:17.140 }' 00:14:17.140 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.140 09:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:14:17.140 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.140 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.140 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:17.398 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:17.656 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.656 "name": "pt2", 00:14:17.656 "aliases": [ 00:14:17.656 "00000000-0000-0000-0000-000000000002" 00:14:17.656 ], 00:14:17.656 "product_name": "passthru", 00:14:17.656 "block_size": 512, 00:14:17.656 "num_blocks": 65536, 00:14:17.656 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:17.656 "assigned_rate_limits": { 00:14:17.656 "rw_ios_per_sec": 0, 00:14:17.656 "rw_mbytes_per_sec": 0, 00:14:17.656 "r_mbytes_per_sec": 0, 00:14:17.656 "w_mbytes_per_sec": 0 00:14:17.656 }, 00:14:17.656 "claimed": true, 00:14:17.656 "claim_type": "exclusive_write", 00:14:17.656 "zoned": false, 00:14:17.656 "supported_io_types": { 00:14:17.656 "read": true, 00:14:17.656 "write": true, 00:14:17.656 "unmap": true, 00:14:17.656 "flush": true, 00:14:17.656 "reset": true, 00:14:17.656 "nvme_admin": false, 00:14:17.656 "nvme_io": false, 00:14:17.656 "nvme_io_md": false, 00:14:17.656 "write_zeroes": true, 00:14:17.656 "zcopy": true, 00:14:17.656 "get_zone_info": false, 00:14:17.656 "zone_management": false, 00:14:17.656 "zone_append": false, 00:14:17.656 "compare": false, 00:14:17.656 "compare_and_write": false, 00:14:17.656 "abort": true, 00:14:17.656 "seek_hole": false, 00:14:17.656 "seek_data": false, 00:14:17.656 "copy": true, 00:14:17.656 "nvme_iov_md": false 00:14:17.656 }, 00:14:17.656 "memory_domains": [ 00:14:17.656 { 00:14:17.656 "dma_device_id": "system", 00:14:17.656 "dma_device_type": 1 00:14:17.656 }, 00:14:17.656 { 00:14:17.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.656 "dma_device_type": 2 00:14:17.656 } 00:14:17.656 ], 00:14:17.656 "driver_specific": { 00:14:17.656 "passthru": { 00:14:17.656 "name": "pt2", 00:14:17.656 "base_bdev_name": "malloc2" 00:14:17.656 } 00:14:17.656 } 00:14:17.656 }' 00:14:17.656 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.656 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.656 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.656 09:18:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.914 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.914 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:17.914 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.914 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.914 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:17.914 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.914 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.914 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:17.914 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.172 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:18.172 09:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:18.172 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.172 "name": "pt3", 00:14:18.172 "aliases": [ 00:14:18.172 "00000000-0000-0000-0000-000000000003" 00:14:18.172 ], 00:14:18.172 "product_name": "passthru", 00:14:18.172 "block_size": 512, 00:14:18.172 "num_blocks": 65536, 00:14:18.172 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:18.172 "assigned_rate_limits": { 00:14:18.172 "rw_ios_per_sec": 0, 00:14:18.172 "rw_mbytes_per_sec": 0, 00:14:18.172 "r_mbytes_per_sec": 0, 00:14:18.172 "w_mbytes_per_sec": 0 00:14:18.172 }, 00:14:18.172 "claimed": true, 00:14:18.172 "claim_type": "exclusive_write", 00:14:18.172 "zoned": false, 00:14:18.172 "supported_io_types": { 00:14:18.172 "read": true, 00:14:18.172 "write": true, 00:14:18.172 "unmap": true, 00:14:18.172 "flush": true, 00:14:18.172 "reset": true, 00:14:18.172 "nvme_admin": false, 00:14:18.172 "nvme_io": false, 00:14:18.172 "nvme_io_md": false, 00:14:18.172 "write_zeroes": true, 00:14:18.172 "zcopy": true, 00:14:18.172 "get_zone_info": false, 00:14:18.172 "zone_management": false, 00:14:18.172 "zone_append": false, 00:14:18.172 "compare": false, 00:14:18.172 "compare_and_write": false, 00:14:18.172 "abort": true, 00:14:18.172 "seek_hole": false, 00:14:18.172 "seek_data": false, 00:14:18.172 "copy": true, 00:14:18.172 "nvme_iov_md": false 00:14:18.172 }, 00:14:18.172 "memory_domains": [ 00:14:18.172 { 00:14:18.172 "dma_device_id": "system", 00:14:18.172 "dma_device_type": 1 00:14:18.172 }, 00:14:18.172 { 00:14:18.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.172 "dma_device_type": 2 00:14:18.172 } 00:14:18.172 ], 00:14:18.172 "driver_specific": { 00:14:18.172 "passthru": { 00:14:18.172 "name": "pt3", 00:14:18.172 "base_bdev_name": "malloc3" 00:14:18.172 } 00:14:18.172 } 00:14:18.172 }' 00:14:18.172 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.432 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.432 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.432 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.432 09:18:27 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.432 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.432 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.432 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.432 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.432 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.690 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.690 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.690 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:18.690 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:18.948 [2024-07-15 09:18:27.658977] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' fa3f66d2-0495-47d6-b28d-475700ea53e3 '!=' fa3f66d2-0495-47d6-b28d-475700ea53e3 ']' 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 113497 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 113497 ']' 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 113497 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 113497 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 113497' 00:14:18.948 killing process with pid 113497 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 113497 00:14:18.948 [2024-07-15 09:18:27.714444] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:18.948 [2024-07-15 09:18:27.714498] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:18.948 [2024-07-15 09:18:27.714551] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:18.948 [2024-07-15 09:18:27.714563] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b74c00 name raid_bdev1, state offline 00:14:18.948 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 113497 00:14:18.948 [2024-07-15 09:18:27.746025] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:19.207 09:18:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:19.207 00:14:19.207 real 0m14.089s 00:14:19.207 user 0m25.380s 00:14:19.207 sys 0m2.515s 00:14:19.207 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:19.207 09:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.207 ************************************ 00:14:19.207 END TEST raid_superblock_test 00:14:19.207 ************************************ 00:14:19.207 09:18:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:19.207 09:18:28 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:19.207 09:18:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:19.207 09:18:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:19.207 09:18:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:19.207 ************************************ 00:14:19.207 START TEST raid_read_error_test 00:14:19.207 ************************************ 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:19.207 09:18:28 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.rlmkzd8BOp 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=115710 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 115710 /var/tmp/spdk-raid.sock 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 115710 ']' 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:19.207 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:19.207 09:18:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.207 [2024-07-15 09:18:28.125327] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
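The raid_read_error_test case that starts here drives bdevperf over the raid RPC socket and injects read failures into one base bdev of a raid0 volume. The following is a condensed sketch of that flow, assembled only from the RPC calls and binaries that appear in this trace; the rpc() shorthand and the sleep are illustrative stand-ins (the real harness uses its waitforlisten/waitforbdev helpers), not part of the test script:

  #!/usr/bin/env bash
  # Condensed sketch of the raid0 read-error flow traced below (not the test script itself).
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk-raid.sock
  rpc() { "$SPDK/scripts/rpc.py" -s "$SOCK" "$@"; }   # local shorthand, not from the original script

  # bdevperf is started as the RPC target; -z defers I/O until perform_tests is called.
  "$SPDK/build/examples/bdevperf" -r "$SOCK" -T raid_bdev1 -t 60 -w randrw -M 50 \
      -o 128k -q 1 -z -f -L bdev_raid &
  sleep 2   # stand-in for the harness's waitforlisten on $SOCK

  # Each base bdev is a malloc disk wrapped by an error bdev (EE_*) and a passthru bdev.
  for i in 1 2 3; do
      rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
      rpc bdev_error_create "BaseBdev${i}_malloc"
      rpc bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
  done

  # Assemble the raid0 bdev with a 64 KiB strip size and an on-disk superblock.
  rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s

  # Run I/O, inject read failures into the first base bdev, verify state, then tear down.
  "$SPDK/examples/bdev/bdevperf/bdevperf.py" -s "$SOCK" perform_tests &
  rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure
  rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
  rpc bdev_raid_delete raid_bdev1
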
00:14:19.207 [2024-07-15 09:18:28.125394] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid115710 ] 00:14:19.466 [2024-07-15 09:18:28.255636] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:19.466 [2024-07-15 09:18:28.363182] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:19.724 [2024-07-15 09:18:28.431721] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:19.724 [2024-07-15 09:18:28.431765] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:20.290 09:18:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:20.290 09:18:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:20.290 09:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:20.290 09:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:20.547 BaseBdev1_malloc 00:14:20.547 09:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:20.806 true 00:14:20.806 09:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:21.064 [2024-07-15 09:18:29.771170] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:21.064 [2024-07-15 09:18:29.771214] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:21.064 [2024-07-15 09:18:29.771237] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x280d0d0 00:14:21.064 [2024-07-15 09:18:29.771250] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:21.064 [2024-07-15 09:18:29.773146] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:21.064 [2024-07-15 09:18:29.773175] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:21.064 BaseBdev1 00:14:21.064 09:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:21.064 09:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:21.322 BaseBdev2_malloc 00:14:21.322 09:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:21.322 true 00:14:21.580 09:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:21.580 [2024-07-15 09:18:30.505884] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:21.580 [2024-07-15 09:18:30.505936] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:21.580 [2024-07-15 09:18:30.505957] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2811910 00:14:21.580 [2024-07-15 09:18:30.505970] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:21.580 [2024-07-15 09:18:30.507529] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:21.580 [2024-07-15 09:18:30.507555] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:21.580 BaseBdev2 00:14:21.580 09:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:21.580 09:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:21.838 BaseBdev3_malloc 00:14:21.838 09:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:22.097 true 00:14:22.097 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:22.355 [2024-07-15 09:18:31.244410] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:22.355 [2024-07-15 09:18:31.244456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:22.355 [2024-07-15 09:18:31.244478] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2813bd0 00:14:22.355 [2024-07-15 09:18:31.244491] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:22.355 [2024-07-15 09:18:31.246107] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:22.355 [2024-07-15 09:18:31.246134] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:22.355 BaseBdev3 00:14:22.355 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:22.614 [2024-07-15 09:18:31.485084] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:22.614 [2024-07-15 09:18:31.486454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:22.614 [2024-07-15 09:18:31.486523] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:22.614 [2024-07-15 09:18:31.486739] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2815280 00:14:22.614 [2024-07-15 09:18:31.486751] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:22.614 [2024-07-15 09:18:31.486960] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2814e20 00:14:22.614 [2024-07-15 09:18:31.487111] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2815280 00:14:22.614 [2024-07-15 09:18:31.487121] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2815280 00:14:22.614 [2024-07-15 09:18:31.487227] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:22.614 
09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.614 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:22.873 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.873 "name": "raid_bdev1", 00:14:22.873 "uuid": "8ddb517b-4c68-4670-b182-076c33a691bc", 00:14:22.873 "strip_size_kb": 64, 00:14:22.873 "state": "online", 00:14:22.873 "raid_level": "raid0", 00:14:22.873 "superblock": true, 00:14:22.873 "num_base_bdevs": 3, 00:14:22.873 "num_base_bdevs_discovered": 3, 00:14:22.873 "num_base_bdevs_operational": 3, 00:14:22.873 "base_bdevs_list": [ 00:14:22.873 { 00:14:22.873 "name": "BaseBdev1", 00:14:22.873 "uuid": "75f41f70-032c-5a67-8a38-10f8ccf3127d", 00:14:22.873 "is_configured": true, 00:14:22.873 "data_offset": 2048, 00:14:22.873 "data_size": 63488 00:14:22.873 }, 00:14:22.873 { 00:14:22.873 "name": "BaseBdev2", 00:14:22.873 "uuid": "945fc4fa-7cd3-532d-a80d-9e20aa1a975b", 00:14:22.873 "is_configured": true, 00:14:22.873 "data_offset": 2048, 00:14:22.873 "data_size": 63488 00:14:22.873 }, 00:14:22.873 { 00:14:22.873 "name": "BaseBdev3", 00:14:22.873 "uuid": "3af66582-fd8a-5c50-952e-96b5b7bb8716", 00:14:22.873 "is_configured": true, 00:14:22.873 "data_offset": 2048, 00:14:22.873 "data_size": 63488 00:14:22.873 } 00:14:22.873 ] 00:14:22.873 }' 00:14:22.873 09:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.873 09:18:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.440 09:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:23.440 09:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:23.698 [2024-07-15 09:18:32.443881] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26635b0 00:14:24.631 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.890 "name": "raid_bdev1", 00:14:24.890 "uuid": "8ddb517b-4c68-4670-b182-076c33a691bc", 00:14:24.890 "strip_size_kb": 64, 00:14:24.890 "state": "online", 00:14:24.890 "raid_level": "raid0", 00:14:24.890 "superblock": true, 00:14:24.890 "num_base_bdevs": 3, 00:14:24.890 "num_base_bdevs_discovered": 3, 00:14:24.890 "num_base_bdevs_operational": 3, 00:14:24.890 "base_bdevs_list": [ 00:14:24.890 { 00:14:24.890 "name": "BaseBdev1", 00:14:24.890 "uuid": "75f41f70-032c-5a67-8a38-10f8ccf3127d", 00:14:24.890 "is_configured": true, 00:14:24.890 "data_offset": 2048, 00:14:24.890 "data_size": 63488 00:14:24.890 }, 00:14:24.890 { 00:14:24.890 "name": "BaseBdev2", 00:14:24.890 "uuid": "945fc4fa-7cd3-532d-a80d-9e20aa1a975b", 00:14:24.890 "is_configured": true, 00:14:24.890 "data_offset": 2048, 00:14:24.890 "data_size": 63488 00:14:24.890 }, 00:14:24.890 { 00:14:24.890 "name": "BaseBdev3", 00:14:24.890 "uuid": "3af66582-fd8a-5c50-952e-96b5b7bb8716", 00:14:24.890 "is_configured": true, 00:14:24.890 "data_offset": 2048, 00:14:24.890 "data_size": 63488 00:14:24.890 } 00:14:24.890 ] 00:14:24.890 }' 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.890 09:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:25.824 [2024-07-15 09:18:34.655857] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:25.824 [2024-07-15 09:18:34.655894] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:25.824 [2024-07-15 
09:18:34.659057] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:25.824 [2024-07-15 09:18:34.659094] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:25.824 [2024-07-15 09:18:34.659128] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:25.824 [2024-07-15 09:18:34.659139] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2815280 name raid_bdev1, state offline 00:14:25.824 0 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 115710 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 115710 ']' 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 115710 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 115710 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 115710' 00:14:25.824 killing process with pid 115710 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 115710 00:14:25.824 [2024-07-15 09:18:34.723510] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:25.824 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 115710 00:14:25.824 [2024-07-15 09:18:34.744602] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.rlmkzd8BOp 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:14:26.082 00:14:26.082 real 0m6.932s 00:14:26.082 user 0m11.000s 00:14:26.082 sys 0m1.208s 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:26.082 09:18:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.082 ************************************ 00:14:26.082 END TEST raid_read_error_test 00:14:26.082 ************************************ 00:14:26.082 09:18:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:26.082 09:18:35 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:14:26.082 09:18:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:26.082 
09:18:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:26.082 09:18:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:26.365 ************************************ 00:14:26.365 START TEST raid_write_error_test 00:14:26.365 ************************************ 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.biKWQ0Eaql 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=116688 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 116688 /var/tmp/spdk-raid.sock 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 116688 ']' 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:26.365 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:26.365 09:18:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.365 [2024-07-15 09:18:35.149828] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:14:26.365 [2024-07-15 09:18:35.149906] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid116688 ] 00:14:26.365 [2024-07-15 09:18:35.280222] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:26.623 [2024-07-15 09:18:35.383824] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:26.623 [2024-07-15 09:18:35.439538] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:26.623 [2024-07-15 09:18:35.439566] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:27.190 09:18:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:27.190 09:18:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:27.190 09:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:27.190 09:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:27.448 BaseBdev1_malloc 00:14:27.448 09:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:27.706 true 00:14:27.707 09:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:27.965 [2024-07-15 09:18:36.803041] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:27.965 [2024-07-15 09:18:36.803089] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:27.965 [2024-07-15 09:18:36.803109] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19860d0 00:14:27.965 [2024-07-15 09:18:36.803121] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:27.965 [2024-07-15 09:18:36.804817] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:27.965 [2024-07-15 09:18:36.804845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:14:27.965 BaseBdev1 00:14:27.965 09:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:27.965 09:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:28.224 BaseBdev2_malloc 00:14:28.224 09:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:28.482 true 00:14:28.482 09:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:28.740 [2024-07-15 09:18:37.545600] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:28.740 [2024-07-15 09:18:37.545646] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:28.740 [2024-07-15 09:18:37.545667] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x198a910 00:14:28.740 [2024-07-15 09:18:37.545680] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:28.740 [2024-07-15 09:18:37.547122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:28.740 [2024-07-15 09:18:37.547150] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:28.740 BaseBdev2 00:14:28.740 09:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:28.740 09:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:28.999 BaseBdev3_malloc 00:14:28.999 09:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:29.257 true 00:14:29.257 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:29.515 [2024-07-15 09:18:38.280110] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:29.515 [2024-07-15 09:18:38.280152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:29.515 [2024-07-15 09:18:38.280170] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x198cbd0 00:14:29.515 [2024-07-15 09:18:38.280183] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:29.515 [2024-07-15 09:18:38.281551] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:29.515 [2024-07-15 09:18:38.281578] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:29.515 BaseBdev3 00:14:29.515 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:29.774 [2024-07-15 09:18:38.524794] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:29.774 [2024-07-15 09:18:38.526083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:29.774 [2024-07-15 09:18:38.526151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:29.774 [2024-07-15 09:18:38.526358] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x198e280 00:14:29.774 [2024-07-15 09:18:38.526369] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:29.774 [2024-07-15 09:18:38.526564] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x198de20 00:14:29.774 [2024-07-15 09:18:38.526712] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x198e280 00:14:29.774 [2024-07-15 09:18:38.526722] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x198e280 00:14:29.774 [2024-07-15 09:18:38.526822] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.774 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:30.033 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.033 "name": "raid_bdev1", 00:14:30.033 "uuid": "09e82fe9-f95e-48b8-ad42-b27d36a13ad9", 00:14:30.033 "strip_size_kb": 64, 00:14:30.033 "state": "online", 00:14:30.033 "raid_level": "raid0", 00:14:30.033 "superblock": true, 00:14:30.033 "num_base_bdevs": 3, 00:14:30.033 "num_base_bdevs_discovered": 3, 00:14:30.033 "num_base_bdevs_operational": 3, 00:14:30.033 "base_bdevs_list": [ 00:14:30.033 { 00:14:30.033 "name": "BaseBdev1", 00:14:30.033 "uuid": "482848c9-c29d-542c-a3eb-bb88b9d99fdd", 00:14:30.033 "is_configured": true, 00:14:30.033 "data_offset": 2048, 00:14:30.033 "data_size": 63488 00:14:30.033 }, 00:14:30.033 { 00:14:30.033 "name": "BaseBdev2", 00:14:30.033 "uuid": "a5790f5b-04e3-5b68-83cd-42b0a1f99962", 00:14:30.033 "is_configured": true, 00:14:30.033 "data_offset": 2048, 00:14:30.033 "data_size": 63488 00:14:30.033 }, 00:14:30.033 { 00:14:30.033 "name": "BaseBdev3", 00:14:30.033 "uuid": 
"20a14c3b-55b2-500e-b690-1568bb2045d1", 00:14:30.033 "is_configured": true, 00:14:30.033 "data_offset": 2048, 00:14:30.033 "data_size": 63488 00:14:30.033 } 00:14:30.033 ] 00:14:30.033 }' 00:14:30.033 09:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.033 09:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:30.599 09:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:30.599 09:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:30.599 [2024-07-15 09:18:39.459568] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17dc5b0 00:14:31.602 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.861 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:32.127 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.127 "name": "raid_bdev1", 00:14:32.127 "uuid": "09e82fe9-f95e-48b8-ad42-b27d36a13ad9", 00:14:32.127 "strip_size_kb": 64, 00:14:32.127 "state": "online", 00:14:32.127 "raid_level": "raid0", 00:14:32.127 "superblock": true, 00:14:32.127 "num_base_bdevs": 3, 00:14:32.127 "num_base_bdevs_discovered": 3, 00:14:32.127 "num_base_bdevs_operational": 3, 00:14:32.127 "base_bdevs_list": [ 00:14:32.127 { 00:14:32.127 "name": "BaseBdev1", 00:14:32.127 "uuid": "482848c9-c29d-542c-a3eb-bb88b9d99fdd", 00:14:32.127 "is_configured": true, 00:14:32.127 "data_offset": 2048, 00:14:32.127 "data_size": 63488 00:14:32.127 }, 00:14:32.127 { 
00:14:32.127 "name": "BaseBdev2", 00:14:32.127 "uuid": "a5790f5b-04e3-5b68-83cd-42b0a1f99962", 00:14:32.127 "is_configured": true, 00:14:32.127 "data_offset": 2048, 00:14:32.127 "data_size": 63488 00:14:32.127 }, 00:14:32.127 { 00:14:32.127 "name": "BaseBdev3", 00:14:32.127 "uuid": "20a14c3b-55b2-500e-b690-1568bb2045d1", 00:14:32.127 "is_configured": true, 00:14:32.127 "data_offset": 2048, 00:14:32.127 "data_size": 63488 00:14:32.127 } 00:14:32.127 ] 00:14:32.127 }' 00:14:32.127 09:18:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.127 09:18:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.694 09:18:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:32.952 [2024-07-15 09:18:41.677722] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:32.952 [2024-07-15 09:18:41.677760] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:32.952 [2024-07-15 09:18:41.680934] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:32.952 [2024-07-15 09:18:41.680972] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:32.952 [2024-07-15 09:18:41.681007] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:32.952 [2024-07-15 09:18:41.681019] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x198e280 name raid_bdev1, state offline 00:14:32.952 0 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 116688 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 116688 ']' 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 116688 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 116688 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 116688' 00:14:32.952 killing process with pid 116688 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 116688 00:14:32.952 [2024-07-15 09:18:41.747571] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:32.952 09:18:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 116688 00:14:32.952 [2024-07-15 09:18:41.768806] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:33.211 09:18:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.biKWQ0Eaql 00:14:33.211 09:18:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:33.211 09:18:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:33.211 09:18:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # 
fail_per_s=0.45 00:14:33.211 09:18:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:33.211 09:18:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:33.211 09:18:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:33.211 09:18:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:14:33.211 00:14:33.211 real 0m6.942s 00:14:33.211 user 0m11.011s 00:14:33.211 sys 0m1.205s 00:14:33.211 09:18:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:33.211 09:18:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.211 ************************************ 00:14:33.211 END TEST raid_write_error_test 00:14:33.211 ************************************ 00:14:33.211 09:18:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:33.211 09:18:42 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:33.211 09:18:42 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:14:33.211 09:18:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:33.211 09:18:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:33.211 09:18:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:33.211 ************************************ 00:14:33.211 START TEST raid_state_function_test 00:14:33.211 ************************************ 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:33.211 09:18:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=117674 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 117674' 00:14:33.211 Process raid pid: 117674 00:14:33.211 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:33.212 09:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 117674 /var/tmp/spdk-raid.sock 00:14:33.212 09:18:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 117674 ']' 00:14:33.212 09:18:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:33.212 09:18:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:33.212 09:18:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:33.212 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:33.212 09:18:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:33.212 09:18:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.469 [2024-07-15 09:18:42.166354] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
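The raid_state_function_test case starting here exercises raid state transitions without any I/O: bdev_svc serves the RPC socket, the concat raid is created before its base bdevs exist (hence the "Currently unable to find bdev" notices that follow), and the reported state is checked after each step. A condensed sketch of the first few steps, again using only RPCs that appear in this trace, with the rpc() shorthand and sleep as illustrative stand-ins:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk-raid.sock
  rpc() { "$SPDK/scripts/rpc.py" -s "$SOCK" "$@"; }   # local shorthand, not from the original script

  # bdev_svc is the RPC target here; no I/O generator is needed for state checks.
  "$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
  sleep 2   # stand-in for the harness's waitforlisten on $SOCK

  # Creating the raid before BaseBdev1..3 exist leaves it in the "configuring" state.
  rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # "configuring"

  # Tear down, recreate, then add base bdevs one at a time and re-check the state.
  rpc bdev_raid_delete Existed_Raid
  rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  rpc bdev_malloc_create 32 512 -b BaseBdev1
  rpc bdev_wait_for_examine
  rpc bdev_get_bdevs -b BaseBdev1 -t 2000   # waitforbdev: confirm BaseBdev1 exists and was claimed
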
00:14:33.469 [2024-07-15 09:18:42.166422] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:33.469 [2024-07-15 09:18:42.297746] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:33.469 [2024-07-15 09:18:42.403829] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:33.726 [2024-07-15 09:18:42.472873] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:33.726 [2024-07-15 09:18:42.472911] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:34.293 09:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:34.294 09:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:34.294 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:34.552 [2024-07-15 09:18:43.324336] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:34.552 [2024-07-15 09:18:43.324377] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:34.552 [2024-07-15 09:18:43.324388] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:34.552 [2024-07-15 09:18:43.324401] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:34.552 [2024-07-15 09:18:43.324409] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:34.552 [2024-07-15 09:18:43.324420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.552 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.810 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:14:34.810 "name": "Existed_Raid", 00:14:34.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.810 "strip_size_kb": 64, 00:14:34.810 "state": "configuring", 00:14:34.810 "raid_level": "concat", 00:14:34.810 "superblock": false, 00:14:34.810 "num_base_bdevs": 3, 00:14:34.810 "num_base_bdevs_discovered": 0, 00:14:34.810 "num_base_bdevs_operational": 3, 00:14:34.810 "base_bdevs_list": [ 00:14:34.810 { 00:14:34.810 "name": "BaseBdev1", 00:14:34.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.810 "is_configured": false, 00:14:34.810 "data_offset": 0, 00:14:34.810 "data_size": 0 00:14:34.810 }, 00:14:34.810 { 00:14:34.810 "name": "BaseBdev2", 00:14:34.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.810 "is_configured": false, 00:14:34.810 "data_offset": 0, 00:14:34.810 "data_size": 0 00:14:34.810 }, 00:14:34.810 { 00:14:34.810 "name": "BaseBdev3", 00:14:34.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.810 "is_configured": false, 00:14:34.810 "data_offset": 0, 00:14:34.810 "data_size": 0 00:14:34.810 } 00:14:34.810 ] 00:14:34.810 }' 00:14:34.810 09:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.810 09:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.377 09:18:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:35.636 [2024-07-15 09:18:44.379005] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:35.636 [2024-07-15 09:18:44.379035] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f6a80 name Existed_Raid, state configuring 00:14:35.636 09:18:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:35.893 [2024-07-15 09:18:44.623665] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:35.893 [2024-07-15 09:18:44.623692] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:35.893 [2024-07-15 09:18:44.623702] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:35.893 [2024-07-15 09:18:44.623714] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:35.893 [2024-07-15 09:18:44.623722] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:35.893 [2024-07-15 09:18:44.623733] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:35.893 09:18:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:36.151 [2024-07-15 09:18:44.882180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:36.151 BaseBdev1 00:14:36.151 09:18:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:36.151 09:18:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:36.151 09:18:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:14:36.151 09:18:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:36.151 09:18:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:36.151 09:18:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:36.151 09:18:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:36.409 09:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:36.668 [ 00:14:36.668 { 00:14:36.668 "name": "BaseBdev1", 00:14:36.668 "aliases": [ 00:14:36.668 "8648ce9c-7031-4bed-8a33-3218c41aa121" 00:14:36.668 ], 00:14:36.668 "product_name": "Malloc disk", 00:14:36.668 "block_size": 512, 00:14:36.668 "num_blocks": 65536, 00:14:36.668 "uuid": "8648ce9c-7031-4bed-8a33-3218c41aa121", 00:14:36.668 "assigned_rate_limits": { 00:14:36.668 "rw_ios_per_sec": 0, 00:14:36.668 "rw_mbytes_per_sec": 0, 00:14:36.668 "r_mbytes_per_sec": 0, 00:14:36.668 "w_mbytes_per_sec": 0 00:14:36.668 }, 00:14:36.668 "claimed": true, 00:14:36.668 "claim_type": "exclusive_write", 00:14:36.668 "zoned": false, 00:14:36.668 "supported_io_types": { 00:14:36.668 "read": true, 00:14:36.668 "write": true, 00:14:36.668 "unmap": true, 00:14:36.668 "flush": true, 00:14:36.668 "reset": true, 00:14:36.668 "nvme_admin": false, 00:14:36.668 "nvme_io": false, 00:14:36.668 "nvme_io_md": false, 00:14:36.668 "write_zeroes": true, 00:14:36.668 "zcopy": true, 00:14:36.668 "get_zone_info": false, 00:14:36.668 "zone_management": false, 00:14:36.668 "zone_append": false, 00:14:36.668 "compare": false, 00:14:36.668 "compare_and_write": false, 00:14:36.668 "abort": true, 00:14:36.668 "seek_hole": false, 00:14:36.668 "seek_data": false, 00:14:36.668 "copy": true, 00:14:36.668 "nvme_iov_md": false 00:14:36.668 }, 00:14:36.668 "memory_domains": [ 00:14:36.668 { 00:14:36.668 "dma_device_id": "system", 00:14:36.668 "dma_device_type": 1 00:14:36.668 }, 00:14:36.668 { 00:14:36.668 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.668 "dma_device_type": 2 00:14:36.668 } 00:14:36.668 ], 00:14:36.668 "driver_specific": {} 00:14:36.668 } 00:14:36.668 ] 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.668 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.926 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.926 "name": "Existed_Raid", 00:14:36.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.926 "strip_size_kb": 64, 00:14:36.926 "state": "configuring", 00:14:36.926 "raid_level": "concat", 00:14:36.926 "superblock": false, 00:14:36.926 "num_base_bdevs": 3, 00:14:36.926 "num_base_bdevs_discovered": 1, 00:14:36.926 "num_base_bdevs_operational": 3, 00:14:36.926 "base_bdevs_list": [ 00:14:36.926 { 00:14:36.926 "name": "BaseBdev1", 00:14:36.926 "uuid": "8648ce9c-7031-4bed-8a33-3218c41aa121", 00:14:36.926 "is_configured": true, 00:14:36.926 "data_offset": 0, 00:14:36.926 "data_size": 65536 00:14:36.926 }, 00:14:36.926 { 00:14:36.926 "name": "BaseBdev2", 00:14:36.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.926 "is_configured": false, 00:14:36.926 "data_offset": 0, 00:14:36.926 "data_size": 0 00:14:36.926 }, 00:14:36.926 { 00:14:36.926 "name": "BaseBdev3", 00:14:36.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.926 "is_configured": false, 00:14:36.926 "data_offset": 0, 00:14:36.926 "data_size": 0 00:14:36.926 } 00:14:36.926 ] 00:14:36.926 }' 00:14:36.926 09:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.926 09:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.491 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:37.491 [2024-07-15 09:18:46.390166] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:37.491 [2024-07-15 09:18:46.390206] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f6310 name Existed_Raid, state configuring 00:14:37.491 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:37.750 [2024-07-15 09:18:46.638862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:37.750 [2024-07-15 09:18:46.640307] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:37.750 [2024-07-15 09:18:46.640341] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:37.750 [2024-07-15 09:18:46.640351] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:37.750 [2024-07-15 09:18:46.640363] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.750 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.008 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:38.008 "name": "Existed_Raid", 00:14:38.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.008 "strip_size_kb": 64, 00:14:38.008 "state": "configuring", 00:14:38.008 "raid_level": "concat", 00:14:38.008 "superblock": false, 00:14:38.008 "num_base_bdevs": 3, 00:14:38.008 "num_base_bdevs_discovered": 1, 00:14:38.008 "num_base_bdevs_operational": 3, 00:14:38.008 "base_bdevs_list": [ 00:14:38.008 { 00:14:38.008 "name": "BaseBdev1", 00:14:38.008 "uuid": "8648ce9c-7031-4bed-8a33-3218c41aa121", 00:14:38.008 "is_configured": true, 00:14:38.008 "data_offset": 0, 00:14:38.008 "data_size": 65536 00:14:38.008 }, 00:14:38.008 { 00:14:38.008 "name": "BaseBdev2", 00:14:38.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.008 "is_configured": false, 00:14:38.008 "data_offset": 0, 00:14:38.008 "data_size": 0 00:14:38.008 }, 00:14:38.008 { 00:14:38.008 "name": "BaseBdev3", 00:14:38.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.008 "is_configured": false, 00:14:38.008 "data_offset": 0, 00:14:38.008 "data_size": 0 00:14:38.008 } 00:14:38.008 ] 00:14:38.008 }' 00:14:38.008 09:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.008 09:18:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.574 09:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:38.831 [2024-07-15 09:18:47.656908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:38.831 BaseBdev2 00:14:38.832 09:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:38.832 09:18:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:38.832 09:18:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:38.832 09:18:47 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:38.832 09:18:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:38.832 09:18:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:38.832 09:18:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.089 09:18:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:39.346 [ 00:14:39.346 { 00:14:39.346 "name": "BaseBdev2", 00:14:39.346 "aliases": [ 00:14:39.346 "e25d44dd-e5f3-4ac9-a67d-fe6673ba6bed" 00:14:39.346 ], 00:14:39.346 "product_name": "Malloc disk", 00:14:39.346 "block_size": 512, 00:14:39.346 "num_blocks": 65536, 00:14:39.346 "uuid": "e25d44dd-e5f3-4ac9-a67d-fe6673ba6bed", 00:14:39.346 "assigned_rate_limits": { 00:14:39.346 "rw_ios_per_sec": 0, 00:14:39.346 "rw_mbytes_per_sec": 0, 00:14:39.346 "r_mbytes_per_sec": 0, 00:14:39.346 "w_mbytes_per_sec": 0 00:14:39.346 }, 00:14:39.346 "claimed": true, 00:14:39.346 "claim_type": "exclusive_write", 00:14:39.346 "zoned": false, 00:14:39.346 "supported_io_types": { 00:14:39.346 "read": true, 00:14:39.346 "write": true, 00:14:39.346 "unmap": true, 00:14:39.346 "flush": true, 00:14:39.346 "reset": true, 00:14:39.346 "nvme_admin": false, 00:14:39.346 "nvme_io": false, 00:14:39.346 "nvme_io_md": false, 00:14:39.346 "write_zeroes": true, 00:14:39.346 "zcopy": true, 00:14:39.346 "get_zone_info": false, 00:14:39.346 "zone_management": false, 00:14:39.346 "zone_append": false, 00:14:39.346 "compare": false, 00:14:39.346 "compare_and_write": false, 00:14:39.346 "abort": true, 00:14:39.346 "seek_hole": false, 00:14:39.346 "seek_data": false, 00:14:39.346 "copy": true, 00:14:39.346 "nvme_iov_md": false 00:14:39.346 }, 00:14:39.346 "memory_domains": [ 00:14:39.346 { 00:14:39.346 "dma_device_id": "system", 00:14:39.346 "dma_device_type": 1 00:14:39.346 }, 00:14:39.346 { 00:14:39.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.346 "dma_device_type": 2 00:14:39.346 } 00:14:39.346 ], 00:14:39.346 "driver_specific": {} 00:14:39.346 } 00:14:39.346 ] 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.346 
09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.346 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.347 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.347 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.347 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.604 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.604 "name": "Existed_Raid", 00:14:39.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.604 "strip_size_kb": 64, 00:14:39.604 "state": "configuring", 00:14:39.604 "raid_level": "concat", 00:14:39.604 "superblock": false, 00:14:39.604 "num_base_bdevs": 3, 00:14:39.604 "num_base_bdevs_discovered": 2, 00:14:39.604 "num_base_bdevs_operational": 3, 00:14:39.604 "base_bdevs_list": [ 00:14:39.604 { 00:14:39.604 "name": "BaseBdev1", 00:14:39.604 "uuid": "8648ce9c-7031-4bed-8a33-3218c41aa121", 00:14:39.604 "is_configured": true, 00:14:39.604 "data_offset": 0, 00:14:39.604 "data_size": 65536 00:14:39.604 }, 00:14:39.604 { 00:14:39.604 "name": "BaseBdev2", 00:14:39.604 "uuid": "e25d44dd-e5f3-4ac9-a67d-fe6673ba6bed", 00:14:39.604 "is_configured": true, 00:14:39.604 "data_offset": 0, 00:14:39.604 "data_size": 65536 00:14:39.604 }, 00:14:39.604 { 00:14:39.604 "name": "BaseBdev3", 00:14:39.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.604 "is_configured": false, 00:14:39.604 "data_offset": 0, 00:14:39.604 "data_size": 0 00:14:39.604 } 00:14:39.604 ] 00:14:39.604 }' 00:14:39.604 09:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.604 09:18:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.169 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:40.426 [2024-07-15 09:18:49.248591] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:40.426 [2024-07-15 09:18:49.248627] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16f7400 00:14:40.426 [2024-07-15 09:18:49.248636] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:40.426 [2024-07-15 09:18:49.248882] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16f6ef0 00:14:40.426 [2024-07-15 09:18:49.249012] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16f7400 00:14:40.426 [2024-07-15 09:18:49.249022] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16f7400 00:14:40.426 [2024-07-15 09:18:49.249178] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:40.426 BaseBdev3 00:14:40.426 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:40.427 09:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:40.427 09:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:40.427 09:18:49 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:40.427 09:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:40.427 09:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:40.427 09:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:40.684 09:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:40.941 [ 00:14:40.941 { 00:14:40.941 "name": "BaseBdev3", 00:14:40.941 "aliases": [ 00:14:40.941 "f51242ab-3e3f-4e06-a1bb-a9323127c2ca" 00:14:40.941 ], 00:14:40.941 "product_name": "Malloc disk", 00:14:40.941 "block_size": 512, 00:14:40.941 "num_blocks": 65536, 00:14:40.941 "uuid": "f51242ab-3e3f-4e06-a1bb-a9323127c2ca", 00:14:40.941 "assigned_rate_limits": { 00:14:40.941 "rw_ios_per_sec": 0, 00:14:40.941 "rw_mbytes_per_sec": 0, 00:14:40.941 "r_mbytes_per_sec": 0, 00:14:40.941 "w_mbytes_per_sec": 0 00:14:40.941 }, 00:14:40.941 "claimed": true, 00:14:40.941 "claim_type": "exclusive_write", 00:14:40.941 "zoned": false, 00:14:40.941 "supported_io_types": { 00:14:40.941 "read": true, 00:14:40.941 "write": true, 00:14:40.941 "unmap": true, 00:14:40.941 "flush": true, 00:14:40.941 "reset": true, 00:14:40.941 "nvme_admin": false, 00:14:40.941 "nvme_io": false, 00:14:40.941 "nvme_io_md": false, 00:14:40.941 "write_zeroes": true, 00:14:40.941 "zcopy": true, 00:14:40.941 "get_zone_info": false, 00:14:40.941 "zone_management": false, 00:14:40.941 "zone_append": false, 00:14:40.941 "compare": false, 00:14:40.941 "compare_and_write": false, 00:14:40.941 "abort": true, 00:14:40.941 "seek_hole": false, 00:14:40.941 "seek_data": false, 00:14:40.941 "copy": true, 00:14:40.941 "nvme_iov_md": false 00:14:40.941 }, 00:14:40.941 "memory_domains": [ 00:14:40.941 { 00:14:40.941 "dma_device_id": "system", 00:14:40.941 "dma_device_type": 1 00:14:40.941 }, 00:14:40.941 { 00:14:40.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.941 "dma_device_type": 2 00:14:40.941 } 00:14:40.941 ], 00:14:40.941 "driver_specific": {} 00:14:40.941 } 00:14:40.941 ] 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.941 09:18:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.941 "name": "Existed_Raid", 00:14:40.941 "uuid": "7900e069-1785-463a-952a-ca41b5e42743", 00:14:40.941 "strip_size_kb": 64, 00:14:40.941 "state": "online", 00:14:40.941 "raid_level": "concat", 00:14:40.941 "superblock": false, 00:14:40.941 "num_base_bdevs": 3, 00:14:40.941 "num_base_bdevs_discovered": 3, 00:14:40.941 "num_base_bdevs_operational": 3, 00:14:40.941 "base_bdevs_list": [ 00:14:40.941 { 00:14:40.941 "name": "BaseBdev1", 00:14:40.941 "uuid": "8648ce9c-7031-4bed-8a33-3218c41aa121", 00:14:40.941 "is_configured": true, 00:14:40.941 "data_offset": 0, 00:14:40.941 "data_size": 65536 00:14:40.941 }, 00:14:40.941 { 00:14:40.941 "name": "BaseBdev2", 00:14:40.941 "uuid": "e25d44dd-e5f3-4ac9-a67d-fe6673ba6bed", 00:14:40.941 "is_configured": true, 00:14:40.941 "data_offset": 0, 00:14:40.941 "data_size": 65536 00:14:40.941 }, 00:14:40.941 { 00:14:40.941 "name": "BaseBdev3", 00:14:40.941 "uuid": "f51242ab-3e3f-4e06-a1bb-a9323127c2ca", 00:14:40.941 "is_configured": true, 00:14:40.941 "data_offset": 0, 00:14:40.941 "data_size": 65536 00:14:40.941 } 00:14:40.941 ] 00:14:40.941 }' 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.941 09:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:41.874 [2024-07-15 09:18:50.700774] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:41.874 "name": "Existed_Raid", 00:14:41.874 "aliases": [ 00:14:41.874 "7900e069-1785-463a-952a-ca41b5e42743" 00:14:41.874 ], 00:14:41.874 "product_name": "Raid Volume", 00:14:41.874 "block_size": 512, 00:14:41.874 "num_blocks": 196608, 00:14:41.874 "uuid": "7900e069-1785-463a-952a-ca41b5e42743", 
00:14:41.874 "assigned_rate_limits": { 00:14:41.874 "rw_ios_per_sec": 0, 00:14:41.874 "rw_mbytes_per_sec": 0, 00:14:41.874 "r_mbytes_per_sec": 0, 00:14:41.874 "w_mbytes_per_sec": 0 00:14:41.874 }, 00:14:41.874 "claimed": false, 00:14:41.874 "zoned": false, 00:14:41.874 "supported_io_types": { 00:14:41.874 "read": true, 00:14:41.874 "write": true, 00:14:41.874 "unmap": true, 00:14:41.874 "flush": true, 00:14:41.874 "reset": true, 00:14:41.874 "nvme_admin": false, 00:14:41.874 "nvme_io": false, 00:14:41.874 "nvme_io_md": false, 00:14:41.874 "write_zeroes": true, 00:14:41.874 "zcopy": false, 00:14:41.874 "get_zone_info": false, 00:14:41.874 "zone_management": false, 00:14:41.874 "zone_append": false, 00:14:41.874 "compare": false, 00:14:41.874 "compare_and_write": false, 00:14:41.874 "abort": false, 00:14:41.874 "seek_hole": false, 00:14:41.874 "seek_data": false, 00:14:41.874 "copy": false, 00:14:41.874 "nvme_iov_md": false 00:14:41.874 }, 00:14:41.874 "memory_domains": [ 00:14:41.874 { 00:14:41.874 "dma_device_id": "system", 00:14:41.874 "dma_device_type": 1 00:14:41.874 }, 00:14:41.874 { 00:14:41.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.874 "dma_device_type": 2 00:14:41.874 }, 00:14:41.874 { 00:14:41.874 "dma_device_id": "system", 00:14:41.874 "dma_device_type": 1 00:14:41.874 }, 00:14:41.874 { 00:14:41.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.874 "dma_device_type": 2 00:14:41.874 }, 00:14:41.874 { 00:14:41.874 "dma_device_id": "system", 00:14:41.874 "dma_device_type": 1 00:14:41.874 }, 00:14:41.874 { 00:14:41.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.874 "dma_device_type": 2 00:14:41.874 } 00:14:41.874 ], 00:14:41.874 "driver_specific": { 00:14:41.874 "raid": { 00:14:41.874 "uuid": "7900e069-1785-463a-952a-ca41b5e42743", 00:14:41.874 "strip_size_kb": 64, 00:14:41.874 "state": "online", 00:14:41.874 "raid_level": "concat", 00:14:41.874 "superblock": false, 00:14:41.874 "num_base_bdevs": 3, 00:14:41.874 "num_base_bdevs_discovered": 3, 00:14:41.874 "num_base_bdevs_operational": 3, 00:14:41.874 "base_bdevs_list": [ 00:14:41.874 { 00:14:41.874 "name": "BaseBdev1", 00:14:41.874 "uuid": "8648ce9c-7031-4bed-8a33-3218c41aa121", 00:14:41.874 "is_configured": true, 00:14:41.874 "data_offset": 0, 00:14:41.874 "data_size": 65536 00:14:41.874 }, 00:14:41.874 { 00:14:41.874 "name": "BaseBdev2", 00:14:41.874 "uuid": "e25d44dd-e5f3-4ac9-a67d-fe6673ba6bed", 00:14:41.874 "is_configured": true, 00:14:41.874 "data_offset": 0, 00:14:41.874 "data_size": 65536 00:14:41.874 }, 00:14:41.874 { 00:14:41.874 "name": "BaseBdev3", 00:14:41.874 "uuid": "f51242ab-3e3f-4e06-a1bb-a9323127c2ca", 00:14:41.874 "is_configured": true, 00:14:41.874 "data_offset": 0, 00:14:41.874 "data_size": 65536 00:14:41.874 } 00:14:41.874 ] 00:14:41.874 } 00:14:41.874 } 00:14:41.874 }' 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:41.874 BaseBdev2 00:14:41.874 BaseBdev3' 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:41.874 09:18:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:42.132 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:42.132 "name": "BaseBdev1", 00:14:42.132 "aliases": [ 00:14:42.132 "8648ce9c-7031-4bed-8a33-3218c41aa121" 00:14:42.132 ], 00:14:42.132 "product_name": "Malloc disk", 00:14:42.132 "block_size": 512, 00:14:42.132 "num_blocks": 65536, 00:14:42.132 "uuid": "8648ce9c-7031-4bed-8a33-3218c41aa121", 00:14:42.132 "assigned_rate_limits": { 00:14:42.132 "rw_ios_per_sec": 0, 00:14:42.132 "rw_mbytes_per_sec": 0, 00:14:42.132 "r_mbytes_per_sec": 0, 00:14:42.132 "w_mbytes_per_sec": 0 00:14:42.132 }, 00:14:42.132 "claimed": true, 00:14:42.132 "claim_type": "exclusive_write", 00:14:42.132 "zoned": false, 00:14:42.132 "supported_io_types": { 00:14:42.132 "read": true, 00:14:42.132 "write": true, 00:14:42.132 "unmap": true, 00:14:42.132 "flush": true, 00:14:42.132 "reset": true, 00:14:42.132 "nvme_admin": false, 00:14:42.132 "nvme_io": false, 00:14:42.132 "nvme_io_md": false, 00:14:42.132 "write_zeroes": true, 00:14:42.132 "zcopy": true, 00:14:42.132 "get_zone_info": false, 00:14:42.132 "zone_management": false, 00:14:42.132 "zone_append": false, 00:14:42.132 "compare": false, 00:14:42.132 "compare_and_write": false, 00:14:42.132 "abort": true, 00:14:42.132 "seek_hole": false, 00:14:42.132 "seek_data": false, 00:14:42.132 "copy": true, 00:14:42.132 "nvme_iov_md": false 00:14:42.132 }, 00:14:42.132 "memory_domains": [ 00:14:42.132 { 00:14:42.132 "dma_device_id": "system", 00:14:42.132 "dma_device_type": 1 00:14:42.132 }, 00:14:42.132 { 00:14:42.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.132 "dma_device_type": 2 00:14:42.132 } 00:14:42.132 ], 00:14:42.132 "driver_specific": {} 00:14:42.132 }' 00:14:42.132 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.132 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.390 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.390 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.390 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.390 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.390 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.390 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.390 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:42.390 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.390 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.648 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.649 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:42.649 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:42.649 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:42.908 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:42.908 "name": "BaseBdev2", 
00:14:42.908 "aliases": [ 00:14:42.908 "e25d44dd-e5f3-4ac9-a67d-fe6673ba6bed" 00:14:42.908 ], 00:14:42.908 "product_name": "Malloc disk", 00:14:42.908 "block_size": 512, 00:14:42.908 "num_blocks": 65536, 00:14:42.908 "uuid": "e25d44dd-e5f3-4ac9-a67d-fe6673ba6bed", 00:14:42.908 "assigned_rate_limits": { 00:14:42.908 "rw_ios_per_sec": 0, 00:14:42.908 "rw_mbytes_per_sec": 0, 00:14:42.908 "r_mbytes_per_sec": 0, 00:14:42.908 "w_mbytes_per_sec": 0 00:14:42.908 }, 00:14:42.908 "claimed": true, 00:14:42.908 "claim_type": "exclusive_write", 00:14:42.908 "zoned": false, 00:14:42.908 "supported_io_types": { 00:14:42.908 "read": true, 00:14:42.908 "write": true, 00:14:42.908 "unmap": true, 00:14:42.908 "flush": true, 00:14:42.908 "reset": true, 00:14:42.908 "nvme_admin": false, 00:14:42.908 "nvme_io": false, 00:14:42.908 "nvme_io_md": false, 00:14:42.908 "write_zeroes": true, 00:14:42.908 "zcopy": true, 00:14:42.908 "get_zone_info": false, 00:14:42.908 "zone_management": false, 00:14:42.908 "zone_append": false, 00:14:42.908 "compare": false, 00:14:42.908 "compare_and_write": false, 00:14:42.908 "abort": true, 00:14:42.908 "seek_hole": false, 00:14:42.908 "seek_data": false, 00:14:42.908 "copy": true, 00:14:42.908 "nvme_iov_md": false 00:14:42.908 }, 00:14:42.908 "memory_domains": [ 00:14:42.908 { 00:14:42.908 "dma_device_id": "system", 00:14:42.908 "dma_device_type": 1 00:14:42.908 }, 00:14:42.908 { 00:14:42.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.908 "dma_device_type": 2 00:14:42.908 } 00:14:42.908 ], 00:14:42.908 "driver_specific": {} 00:14:42.908 }' 00:14:42.908 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.908 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.908 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.908 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.908 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.908 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.908 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.908 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:43.167 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:43.167 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:43.167 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:43.167 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:43.167 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:43.167 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:43.167 09:18:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:43.425 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:43.425 "name": "BaseBdev3", 00:14:43.425 "aliases": [ 00:14:43.425 "f51242ab-3e3f-4e06-a1bb-a9323127c2ca" 00:14:43.425 ], 00:14:43.425 "product_name": "Malloc disk", 00:14:43.425 "block_size": 512, 
00:14:43.425 "num_blocks": 65536, 00:14:43.425 "uuid": "f51242ab-3e3f-4e06-a1bb-a9323127c2ca", 00:14:43.425 "assigned_rate_limits": { 00:14:43.425 "rw_ios_per_sec": 0, 00:14:43.425 "rw_mbytes_per_sec": 0, 00:14:43.425 "r_mbytes_per_sec": 0, 00:14:43.425 "w_mbytes_per_sec": 0 00:14:43.425 }, 00:14:43.425 "claimed": true, 00:14:43.425 "claim_type": "exclusive_write", 00:14:43.425 "zoned": false, 00:14:43.425 "supported_io_types": { 00:14:43.425 "read": true, 00:14:43.425 "write": true, 00:14:43.425 "unmap": true, 00:14:43.425 "flush": true, 00:14:43.425 "reset": true, 00:14:43.425 "nvme_admin": false, 00:14:43.425 "nvme_io": false, 00:14:43.425 "nvme_io_md": false, 00:14:43.425 "write_zeroes": true, 00:14:43.425 "zcopy": true, 00:14:43.425 "get_zone_info": false, 00:14:43.425 "zone_management": false, 00:14:43.426 "zone_append": false, 00:14:43.426 "compare": false, 00:14:43.426 "compare_and_write": false, 00:14:43.426 "abort": true, 00:14:43.426 "seek_hole": false, 00:14:43.426 "seek_data": false, 00:14:43.426 "copy": true, 00:14:43.426 "nvme_iov_md": false 00:14:43.426 }, 00:14:43.426 "memory_domains": [ 00:14:43.426 { 00:14:43.426 "dma_device_id": "system", 00:14:43.426 "dma_device_type": 1 00:14:43.426 }, 00:14:43.426 { 00:14:43.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.426 "dma_device_type": 2 00:14:43.426 } 00:14:43.426 ], 00:14:43.426 "driver_specific": {} 00:14:43.426 }' 00:14:43.426 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:43.426 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:43.426 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:43.426 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:43.426 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:43.685 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:43.685 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:43.685 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:43.685 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:43.685 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:43.685 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:43.685 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:43.685 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:43.943 [2024-07-15 09:18:52.778048] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:43.943 [2024-07-15 09:18:52.778075] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:43.943 [2024-07-15 09:18:52.778115] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:43.943 09:18:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.943 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.944 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.944 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.944 09:18:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.202 09:18:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.202 "name": "Existed_Raid", 00:14:44.202 "uuid": "7900e069-1785-463a-952a-ca41b5e42743", 00:14:44.202 "strip_size_kb": 64, 00:14:44.202 "state": "offline", 00:14:44.202 "raid_level": "concat", 00:14:44.202 "superblock": false, 00:14:44.202 "num_base_bdevs": 3, 00:14:44.202 "num_base_bdevs_discovered": 2, 00:14:44.202 "num_base_bdevs_operational": 2, 00:14:44.202 "base_bdevs_list": [ 00:14:44.202 { 00:14:44.202 "name": null, 00:14:44.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:44.202 "is_configured": false, 00:14:44.202 "data_offset": 0, 00:14:44.202 "data_size": 65536 00:14:44.202 }, 00:14:44.202 { 00:14:44.202 "name": "BaseBdev2", 00:14:44.202 "uuid": "e25d44dd-e5f3-4ac9-a67d-fe6673ba6bed", 00:14:44.202 "is_configured": true, 00:14:44.202 "data_offset": 0, 00:14:44.202 "data_size": 65536 00:14:44.202 }, 00:14:44.202 { 00:14:44.202 "name": "BaseBdev3", 00:14:44.202 "uuid": "f51242ab-3e3f-4e06-a1bb-a9323127c2ca", 00:14:44.202 "is_configured": true, 00:14:44.202 "data_offset": 0, 00:14:44.202 "data_size": 65536 00:14:44.202 } 00:14:44.202 ] 00:14:44.202 }' 00:14:44.202 09:18:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.202 09:18:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:44.779 09:18:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:44.779 09:18:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:44.779 09:18:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.779 09:18:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:45.037 09:18:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:45.037 09:18:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:45.037 09:18:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:45.295 [2024-07-15 09:18:54.114585] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:45.295 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:45.295 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:45.295 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.295 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:45.554 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:45.554 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:45.554 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:45.813 [2024-07-15 09:18:54.622489] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:45.813 [2024-07-15 09:18:54.622536] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f7400 name Existed_Raid, state offline 00:14:45.813 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:45.813 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:45.813 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.813 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:46.072 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:46.072 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:46.072 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:46.072 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:46.072 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:46.072 09:18:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:46.331 BaseBdev2 00:14:46.331 09:18:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:46.331 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:46.331 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:46.331 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
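(Editorial note: the lines above and below step through waitforbdev from the shared autotest_common.sh helpers. With no explicit timeout it falls back to 2000 ms, waits for bdev examine to finish, and then asks the target for the named bdev, letting the RPC's -t option do the waiting. A rough shell equivalent of the traced calls, assuming the same rpc.py client and socket; the real helper carries extra loop/retry bookkeeping not shown here:

    waitforbdev() {
        local bdev_name=$1
        local bdev_timeout=${2:-2000}   # ms; the trace shows this 2000 default being applied
        ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
        # per the traced usage, -t makes bdev_get_bdevs wait up to bdev_timeout ms for the bdev to appear
        ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout"
    }
    waitforbdev BaseBdev2

On success the helper returns 0 and the loop at bdev_raid.sh@301 moves on to the next base bdev; if the bdev never shows up, the RPC fails and the test aborts through the usual error path.)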
00:14:46.331 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:46.331 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:46.331 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:46.590 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:46.848 [ 00:14:46.848 { 00:14:46.848 "name": "BaseBdev2", 00:14:46.848 "aliases": [ 00:14:46.848 "b3107fb7-75ae-4ef3-88ee-e3f527e2c057" 00:14:46.848 ], 00:14:46.848 "product_name": "Malloc disk", 00:14:46.848 "block_size": 512, 00:14:46.848 "num_blocks": 65536, 00:14:46.848 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:14:46.848 "assigned_rate_limits": { 00:14:46.848 "rw_ios_per_sec": 0, 00:14:46.848 "rw_mbytes_per_sec": 0, 00:14:46.848 "r_mbytes_per_sec": 0, 00:14:46.848 "w_mbytes_per_sec": 0 00:14:46.848 }, 00:14:46.848 "claimed": false, 00:14:46.848 "zoned": false, 00:14:46.848 "supported_io_types": { 00:14:46.848 "read": true, 00:14:46.848 "write": true, 00:14:46.848 "unmap": true, 00:14:46.848 "flush": true, 00:14:46.848 "reset": true, 00:14:46.848 "nvme_admin": false, 00:14:46.848 "nvme_io": false, 00:14:46.848 "nvme_io_md": false, 00:14:46.848 "write_zeroes": true, 00:14:46.848 "zcopy": true, 00:14:46.848 "get_zone_info": false, 00:14:46.848 "zone_management": false, 00:14:46.848 "zone_append": false, 00:14:46.848 "compare": false, 00:14:46.848 "compare_and_write": false, 00:14:46.848 "abort": true, 00:14:46.848 "seek_hole": false, 00:14:46.848 "seek_data": false, 00:14:46.848 "copy": true, 00:14:46.848 "nvme_iov_md": false 00:14:46.848 }, 00:14:46.848 "memory_domains": [ 00:14:46.848 { 00:14:46.848 "dma_device_id": "system", 00:14:46.848 "dma_device_type": 1 00:14:46.848 }, 00:14:46.848 { 00:14:46.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.848 "dma_device_type": 2 00:14:46.848 } 00:14:46.848 ], 00:14:46.848 "driver_specific": {} 00:14:46.848 } 00:14:46.848 ] 00:14:46.848 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:46.848 09:18:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:46.848 09:18:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:46.848 09:18:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:47.107 BaseBdev3 00:14:47.107 09:18:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:47.107 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:47.107 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:47.107 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:47.107 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:47.107 09:18:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:47.107 09:18:55 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:47.366 09:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:47.626 [ 00:14:47.626 { 00:14:47.626 "name": "BaseBdev3", 00:14:47.626 "aliases": [ 00:14:47.626 "263011ef-fdbd-48b0-bc5a-6b309de223ab" 00:14:47.626 ], 00:14:47.626 "product_name": "Malloc disk", 00:14:47.626 "block_size": 512, 00:14:47.626 "num_blocks": 65536, 00:14:47.626 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:14:47.626 "assigned_rate_limits": { 00:14:47.626 "rw_ios_per_sec": 0, 00:14:47.626 "rw_mbytes_per_sec": 0, 00:14:47.626 "r_mbytes_per_sec": 0, 00:14:47.626 "w_mbytes_per_sec": 0 00:14:47.626 }, 00:14:47.626 "claimed": false, 00:14:47.626 "zoned": false, 00:14:47.626 "supported_io_types": { 00:14:47.626 "read": true, 00:14:47.626 "write": true, 00:14:47.626 "unmap": true, 00:14:47.626 "flush": true, 00:14:47.626 "reset": true, 00:14:47.626 "nvme_admin": false, 00:14:47.626 "nvme_io": false, 00:14:47.626 "nvme_io_md": false, 00:14:47.626 "write_zeroes": true, 00:14:47.626 "zcopy": true, 00:14:47.626 "get_zone_info": false, 00:14:47.626 "zone_management": false, 00:14:47.626 "zone_append": false, 00:14:47.626 "compare": false, 00:14:47.626 "compare_and_write": false, 00:14:47.626 "abort": true, 00:14:47.626 "seek_hole": false, 00:14:47.626 "seek_data": false, 00:14:47.626 "copy": true, 00:14:47.626 "nvme_iov_md": false 00:14:47.626 }, 00:14:47.626 "memory_domains": [ 00:14:47.626 { 00:14:47.626 "dma_device_id": "system", 00:14:47.626 "dma_device_type": 1 00:14:47.626 }, 00:14:47.626 { 00:14:47.626 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.626 "dma_device_type": 2 00:14:47.626 } 00:14:47.626 ], 00:14:47.626 "driver_specific": {} 00:14:47.626 } 00:14:47.626 ] 00:14:47.626 09:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:47.626 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:47.626 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:47.626 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:47.626 [2024-07-15 09:18:56.576842] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:47.626 [2024-07-15 09:18:56.576885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:47.626 [2024-07-15 09:18:56.576904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:47.626 [2024-07-15 09:18:56.578277] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.952 "name": "Existed_Raid", 00:14:47.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.952 "strip_size_kb": 64, 00:14:47.952 "state": "configuring", 00:14:47.952 "raid_level": "concat", 00:14:47.952 "superblock": false, 00:14:47.952 "num_base_bdevs": 3, 00:14:47.952 "num_base_bdevs_discovered": 2, 00:14:47.952 "num_base_bdevs_operational": 3, 00:14:47.952 "base_bdevs_list": [ 00:14:47.952 { 00:14:47.952 "name": "BaseBdev1", 00:14:47.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.952 "is_configured": false, 00:14:47.952 "data_offset": 0, 00:14:47.952 "data_size": 0 00:14:47.952 }, 00:14:47.952 { 00:14:47.952 "name": "BaseBdev2", 00:14:47.952 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:14:47.952 "is_configured": true, 00:14:47.952 "data_offset": 0, 00:14:47.952 "data_size": 65536 00:14:47.952 }, 00:14:47.952 { 00:14:47.952 "name": "BaseBdev3", 00:14:47.952 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:14:47.952 "is_configured": true, 00:14:47.952 "data_offset": 0, 00:14:47.952 "data_size": 65536 00:14:47.952 } 00:14:47.952 ] 00:14:47.952 }' 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.952 09:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.547 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:48.805 [2024-07-15 09:18:57.571453] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.805 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.064 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.064 "name": "Existed_Raid", 00:14:49.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.064 "strip_size_kb": 64, 00:14:49.064 "state": "configuring", 00:14:49.064 "raid_level": "concat", 00:14:49.064 "superblock": false, 00:14:49.064 "num_base_bdevs": 3, 00:14:49.064 "num_base_bdevs_discovered": 1, 00:14:49.064 "num_base_bdevs_operational": 3, 00:14:49.064 "base_bdevs_list": [ 00:14:49.064 { 00:14:49.064 "name": "BaseBdev1", 00:14:49.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.064 "is_configured": false, 00:14:49.064 "data_offset": 0, 00:14:49.064 "data_size": 0 00:14:49.064 }, 00:14:49.064 { 00:14:49.064 "name": null, 00:14:49.064 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:14:49.064 "is_configured": false, 00:14:49.064 "data_offset": 0, 00:14:49.064 "data_size": 65536 00:14:49.064 }, 00:14:49.064 { 00:14:49.064 "name": "BaseBdev3", 00:14:49.064 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:14:49.064 "is_configured": true, 00:14:49.064 "data_offset": 0, 00:14:49.064 "data_size": 65536 00:14:49.064 } 00:14:49.064 ] 00:14:49.064 }' 00:14:49.064 09:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.064 09:18:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:49.631 09:18:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.631 09:18:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:49.889 09:18:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:49.889 09:18:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:50.146 [2024-07-15 09:18:58.862340] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:50.146 BaseBdev1 00:14:50.146 09:18:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:50.146 09:18:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:50.146 09:18:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:50.146 09:18:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:50.146 09:18:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:50.146 09:18:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:50.146 09:18:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:50.403 09:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:50.662 [ 00:14:50.662 { 00:14:50.662 "name": "BaseBdev1", 00:14:50.662 "aliases": [ 00:14:50.662 "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048" 00:14:50.662 ], 00:14:50.662 "product_name": "Malloc disk", 00:14:50.662 "block_size": 512, 00:14:50.662 "num_blocks": 65536, 00:14:50.662 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:14:50.662 "assigned_rate_limits": { 00:14:50.662 "rw_ios_per_sec": 0, 00:14:50.662 "rw_mbytes_per_sec": 0, 00:14:50.662 "r_mbytes_per_sec": 0, 00:14:50.662 "w_mbytes_per_sec": 0 00:14:50.662 }, 00:14:50.662 "claimed": true, 00:14:50.662 "claim_type": "exclusive_write", 00:14:50.662 "zoned": false, 00:14:50.662 "supported_io_types": { 00:14:50.662 "read": true, 00:14:50.662 "write": true, 00:14:50.662 "unmap": true, 00:14:50.662 "flush": true, 00:14:50.662 "reset": true, 00:14:50.662 "nvme_admin": false, 00:14:50.662 "nvme_io": false, 00:14:50.662 "nvme_io_md": false, 00:14:50.662 "write_zeroes": true, 00:14:50.662 "zcopy": true, 00:14:50.662 "get_zone_info": false, 00:14:50.662 "zone_management": false, 00:14:50.662 "zone_append": false, 00:14:50.662 "compare": false, 00:14:50.662 "compare_and_write": false, 00:14:50.662 "abort": true, 00:14:50.662 "seek_hole": false, 00:14:50.662 "seek_data": false, 00:14:50.662 "copy": true, 00:14:50.662 "nvme_iov_md": false 00:14:50.662 }, 00:14:50.662 "memory_domains": [ 00:14:50.662 { 00:14:50.662 "dma_device_id": "system", 00:14:50.662 "dma_device_type": 1 00:14:50.662 }, 00:14:50.662 { 00:14:50.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.662 "dma_device_type": 2 00:14:50.662 } 00:14:50.662 ], 00:14:50.662 "driver_specific": {} 00:14:50.662 } 00:14:50.662 ] 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:14:50.662 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.920 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.920 "name": "Existed_Raid", 00:14:50.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.920 "strip_size_kb": 64, 00:14:50.920 "state": "configuring", 00:14:50.920 "raid_level": "concat", 00:14:50.920 "superblock": false, 00:14:50.920 "num_base_bdevs": 3, 00:14:50.920 "num_base_bdevs_discovered": 2, 00:14:50.920 "num_base_bdevs_operational": 3, 00:14:50.920 "base_bdevs_list": [ 00:14:50.920 { 00:14:50.920 "name": "BaseBdev1", 00:14:50.920 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:14:50.920 "is_configured": true, 00:14:50.920 "data_offset": 0, 00:14:50.920 "data_size": 65536 00:14:50.920 }, 00:14:50.920 { 00:14:50.920 "name": null, 00:14:50.920 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:14:50.920 "is_configured": false, 00:14:50.920 "data_offset": 0, 00:14:50.920 "data_size": 65536 00:14:50.920 }, 00:14:50.920 { 00:14:50.920 "name": "BaseBdev3", 00:14:50.920 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:14:50.920 "is_configured": true, 00:14:50.920 "data_offset": 0, 00:14:50.920 "data_size": 65536 00:14:50.920 } 00:14:50.920 ] 00:14:50.920 }' 00:14:50.920 09:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.920 09:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.487 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.487 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:51.745 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:51.745 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:52.004 [2024-07-15 09:19:00.955956] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.262 09:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.520 09:19:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.520 "name": "Existed_Raid", 00:14:52.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.520 "strip_size_kb": 64, 00:14:52.520 "state": "configuring", 00:14:52.520 "raid_level": "concat", 00:14:52.520 "superblock": false, 00:14:52.520 "num_base_bdevs": 3, 00:14:52.520 "num_base_bdevs_discovered": 1, 00:14:52.520 "num_base_bdevs_operational": 3, 00:14:52.520 "base_bdevs_list": [ 00:14:52.520 { 00:14:52.520 "name": "BaseBdev1", 00:14:52.520 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:14:52.520 "is_configured": true, 00:14:52.520 "data_offset": 0, 00:14:52.520 "data_size": 65536 00:14:52.520 }, 00:14:52.520 { 00:14:52.520 "name": null, 00:14:52.520 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:14:52.520 "is_configured": false, 00:14:52.520 "data_offset": 0, 00:14:52.520 "data_size": 65536 00:14:52.520 }, 00:14:52.520 { 00:14:52.520 "name": null, 00:14:52.520 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:14:52.520 "is_configured": false, 00:14:52.520 "data_offset": 0, 00:14:52.520 "data_size": 65536 00:14:52.520 } 00:14:52.520 ] 00:14:52.520 }' 00:14:52.520 09:19:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.520 09:19:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.086 09:19:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:53.086 09:19:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.345 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:53.345 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:53.912 [2024-07-15 09:19:02.564230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.912 09:19:02 
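The state assertions in the trace above reduce to one RPC plus a jq filter. A minimal standalone sketch of that query, using only commands that appear in this run; the $rpc/$sock shell variables are our own shorthand, and a bdev_svc target is assumed to already be listening on the socket:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Dump every raid bdev and keep only the one named Existed_Raid; state, raid_level,
  # num_base_bdevs_discovered and base_bdevs_list are the fields the test asserts on.
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
  # Spot-check a single slot, e.g. whether the third base bdev is configured; in this run
  # it prints false right after bdev_raid_remove_base_bdev and true again after the re-add.
  $rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'
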
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.912 "name": "Existed_Raid", 00:14:53.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.912 "strip_size_kb": 64, 00:14:53.912 "state": "configuring", 00:14:53.912 "raid_level": "concat", 00:14:53.912 "superblock": false, 00:14:53.912 "num_base_bdevs": 3, 00:14:53.912 "num_base_bdevs_discovered": 2, 00:14:53.912 "num_base_bdevs_operational": 3, 00:14:53.912 "base_bdevs_list": [ 00:14:53.912 { 00:14:53.912 "name": "BaseBdev1", 00:14:53.912 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:14:53.912 "is_configured": true, 00:14:53.912 "data_offset": 0, 00:14:53.912 "data_size": 65536 00:14:53.912 }, 00:14:53.912 { 00:14:53.912 "name": null, 00:14:53.912 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:14:53.912 "is_configured": false, 00:14:53.912 "data_offset": 0, 00:14:53.912 "data_size": 65536 00:14:53.912 }, 00:14:53.912 { 00:14:53.912 "name": "BaseBdev3", 00:14:53.912 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:14:53.912 "is_configured": true, 00:14:53.912 "data_offset": 0, 00:14:53.912 "data_size": 65536 00:14:53.912 } 00:14:53.912 ] 00:14:53.912 }' 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.912 09:19:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.477 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.477 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:54.734 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:54.734 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:54.993 [2024-07-15 09:19:03.871698] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.993 09:19:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.993 09:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:55.250 09:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.250 "name": "Existed_Raid", 00:14:55.250 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.250 "strip_size_kb": 64, 00:14:55.250 "state": "configuring", 00:14:55.250 "raid_level": "concat", 00:14:55.250 "superblock": false, 00:14:55.250 "num_base_bdevs": 3, 00:14:55.250 "num_base_bdevs_discovered": 1, 00:14:55.250 "num_base_bdevs_operational": 3, 00:14:55.250 "base_bdevs_list": [ 00:14:55.250 { 00:14:55.250 "name": null, 00:14:55.250 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:14:55.250 "is_configured": false, 00:14:55.250 "data_offset": 0, 00:14:55.250 "data_size": 65536 00:14:55.250 }, 00:14:55.250 { 00:14:55.250 "name": null, 00:14:55.250 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:14:55.250 "is_configured": false, 00:14:55.250 "data_offset": 0, 00:14:55.250 "data_size": 65536 00:14:55.250 }, 00:14:55.250 { 00:14:55.250 "name": "BaseBdev3", 00:14:55.250 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:14:55.250 "is_configured": true, 00:14:55.250 "data_offset": 0, 00:14:55.250 "data_size": 65536 00:14:55.250 } 00:14:55.250 ] 00:14:55.250 }' 00:14:55.250 09:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.250 09:19:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.815 09:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.815 09:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:56.073 09:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:56.073 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:56.332 [2024-07-15 09:19:05.155427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.332 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.591 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.591 "name": "Existed_Raid", 00:14:56.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.591 "strip_size_kb": 64, 00:14:56.591 "state": "configuring", 00:14:56.591 "raid_level": "concat", 00:14:56.591 "superblock": false, 00:14:56.591 "num_base_bdevs": 3, 00:14:56.591 "num_base_bdevs_discovered": 2, 00:14:56.591 "num_base_bdevs_operational": 3, 00:14:56.591 "base_bdevs_list": [ 00:14:56.591 { 00:14:56.591 "name": null, 00:14:56.591 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:14:56.591 "is_configured": false, 00:14:56.591 "data_offset": 0, 00:14:56.591 "data_size": 65536 00:14:56.591 }, 00:14:56.591 { 00:14:56.591 "name": "BaseBdev2", 00:14:56.591 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:14:56.591 "is_configured": true, 00:14:56.591 "data_offset": 0, 00:14:56.591 "data_size": 65536 00:14:56.591 }, 00:14:56.591 { 00:14:56.591 "name": "BaseBdev3", 00:14:56.591 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:14:56.591 "is_configured": true, 00:14:56.591 "data_offset": 0, 00:14:56.591 "data_size": 65536 00:14:56.591 } 00:14:56.591 ] 00:14:56.591 }' 00:14:56.591 09:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.591 09:19:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.159 09:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.159 09:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:57.417 09:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:57.417 09:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.417 09:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:57.675 09:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4d0b4b5e-6d94-4531-afeb-5cc8e9c22048 00:14:58.242 [2024-07-15 09:19:07.012138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:58.242 [2024-07-15 09:19:07.012175] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16f5450 00:14:58.242 [2024-07-15 09:19:07.012184] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:58.242 [2024-07-15 09:19:07.012372] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16f6ed0 00:14:58.242 [2024-07-15 09:19:07.012484] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16f5450 00:14:58.242 [2024-07-15 09:19:07.012494] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16f5450 00:14:58.242 [2024-07-15 09:19:07.012654] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:58.242 NewBaseBdev 00:14:58.242 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:58.242 09:19:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:58.242 09:19:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:58.242 09:19:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:58.242 09:19:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:58.242 09:19:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:58.242 09:19:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:58.501 09:19:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:59.067 [ 00:14:59.067 { 00:14:59.067 "name": "NewBaseBdev", 00:14:59.067 "aliases": [ 00:14:59.067 "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048" 00:14:59.067 ], 00:14:59.067 "product_name": "Malloc disk", 00:14:59.067 "block_size": 512, 00:14:59.067 "num_blocks": 65536, 00:14:59.067 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:14:59.067 "assigned_rate_limits": { 00:14:59.067 "rw_ios_per_sec": 0, 00:14:59.067 "rw_mbytes_per_sec": 0, 00:14:59.067 "r_mbytes_per_sec": 0, 00:14:59.067 "w_mbytes_per_sec": 0 00:14:59.067 }, 00:14:59.067 "claimed": true, 00:14:59.067 "claim_type": "exclusive_write", 00:14:59.067 "zoned": false, 00:14:59.067 "supported_io_types": { 00:14:59.067 "read": true, 00:14:59.067 "write": true, 00:14:59.067 "unmap": true, 00:14:59.067 "flush": true, 00:14:59.067 "reset": true, 00:14:59.067 "nvme_admin": false, 00:14:59.067 "nvme_io": false, 00:14:59.067 "nvme_io_md": false, 00:14:59.067 "write_zeroes": true, 00:14:59.067 "zcopy": true, 00:14:59.067 "get_zone_info": false, 00:14:59.067 "zone_management": false, 00:14:59.067 "zone_append": false, 00:14:59.067 "compare": false, 00:14:59.067 "compare_and_write": false, 00:14:59.067 "abort": true, 00:14:59.067 "seek_hole": false, 00:14:59.067 "seek_data": false, 00:14:59.067 "copy": true, 00:14:59.067 "nvme_iov_md": false 00:14:59.067 }, 00:14:59.067 "memory_domains": [ 00:14:59.067 { 00:14:59.067 "dma_device_id": "system", 00:14:59.067 "dma_device_type": 1 00:14:59.067 }, 00:14:59.067 { 00:14:59.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.067 "dma_device_type": 2 00:14:59.067 } 00:14:59.067 ], 00:14:59.067 "driver_specific": {} 00:14:59.067 } 00:14:59.067 ] 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- 
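The recovery step above turns on the -u flag: the replacement malloc disk is created under a new name but with the UUID of the BaseBdev1 that was deleted, so the configuring raid claims it and transitions to online. A sketch of just that sequence, with every command taken verbatim from this run and $rpc/$sock as our shorthand:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # 32 MiB malloc disk, 512 B blocks (65536 blocks), re-using the original base bdev UUID.
  $rpc -s $sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4d0b4b5e-6d94-4531-afeb-5cc8e9c22048
  # waitforbdev: let examine finish, then poll for the bdev with a 2000 ms timeout.
  $rpc -s $sock bdev_wait_for_examine
  $rpc -s $sock bdev_get_bdevs -b NewBaseBdev -t 2000
  # The raid should now report "state": "online" with 3 of 3 base bdevs discovered.
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
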
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.067 09:19:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.325 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.325 "name": "Existed_Raid", 00:14:59.325 "uuid": "e570dd29-54b9-4592-b2d4-6af6c93e9e85", 00:14:59.325 "strip_size_kb": 64, 00:14:59.325 "state": "online", 00:14:59.325 "raid_level": "concat", 00:14:59.325 "superblock": false, 00:14:59.325 "num_base_bdevs": 3, 00:14:59.325 "num_base_bdevs_discovered": 3, 00:14:59.325 "num_base_bdevs_operational": 3, 00:14:59.325 "base_bdevs_list": [ 00:14:59.325 { 00:14:59.325 "name": "NewBaseBdev", 00:14:59.325 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:14:59.325 "is_configured": true, 00:14:59.325 "data_offset": 0, 00:14:59.325 "data_size": 65536 00:14:59.325 }, 00:14:59.325 { 00:14:59.326 "name": "BaseBdev2", 00:14:59.326 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:14:59.326 "is_configured": true, 00:14:59.326 "data_offset": 0, 00:14:59.326 "data_size": 65536 00:14:59.326 }, 00:14:59.326 { 00:14:59.326 "name": "BaseBdev3", 00:14:59.326 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:14:59.326 "is_configured": true, 00:14:59.326 "data_offset": 0, 00:14:59.326 "data_size": 65536 00:14:59.326 } 00:14:59.326 ] 00:14:59.326 }' 00:14:59.326 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.326 09:19:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.891 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:59.891 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:59.891 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:59.891 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:59.891 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:59.891 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:59.891 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 
-b Existed_Raid 00:14:59.891 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:00.149 [2024-07-15 09:19:08.873561] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:00.149 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:00.149 "name": "Existed_Raid", 00:15:00.149 "aliases": [ 00:15:00.149 "e570dd29-54b9-4592-b2d4-6af6c93e9e85" 00:15:00.149 ], 00:15:00.149 "product_name": "Raid Volume", 00:15:00.149 "block_size": 512, 00:15:00.149 "num_blocks": 196608, 00:15:00.149 "uuid": "e570dd29-54b9-4592-b2d4-6af6c93e9e85", 00:15:00.149 "assigned_rate_limits": { 00:15:00.149 "rw_ios_per_sec": 0, 00:15:00.149 "rw_mbytes_per_sec": 0, 00:15:00.149 "r_mbytes_per_sec": 0, 00:15:00.149 "w_mbytes_per_sec": 0 00:15:00.149 }, 00:15:00.149 "claimed": false, 00:15:00.149 "zoned": false, 00:15:00.149 "supported_io_types": { 00:15:00.149 "read": true, 00:15:00.149 "write": true, 00:15:00.149 "unmap": true, 00:15:00.149 "flush": true, 00:15:00.149 "reset": true, 00:15:00.149 "nvme_admin": false, 00:15:00.149 "nvme_io": false, 00:15:00.149 "nvme_io_md": false, 00:15:00.149 "write_zeroes": true, 00:15:00.149 "zcopy": false, 00:15:00.149 "get_zone_info": false, 00:15:00.149 "zone_management": false, 00:15:00.149 "zone_append": false, 00:15:00.149 "compare": false, 00:15:00.149 "compare_and_write": false, 00:15:00.149 "abort": false, 00:15:00.149 "seek_hole": false, 00:15:00.149 "seek_data": false, 00:15:00.149 "copy": false, 00:15:00.149 "nvme_iov_md": false 00:15:00.149 }, 00:15:00.149 "memory_domains": [ 00:15:00.149 { 00:15:00.149 "dma_device_id": "system", 00:15:00.149 "dma_device_type": 1 00:15:00.149 }, 00:15:00.149 { 00:15:00.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.149 "dma_device_type": 2 00:15:00.149 }, 00:15:00.149 { 00:15:00.149 "dma_device_id": "system", 00:15:00.149 "dma_device_type": 1 00:15:00.149 }, 00:15:00.149 { 00:15:00.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.149 "dma_device_type": 2 00:15:00.149 }, 00:15:00.149 { 00:15:00.149 "dma_device_id": "system", 00:15:00.149 "dma_device_type": 1 00:15:00.149 }, 00:15:00.149 { 00:15:00.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.149 "dma_device_type": 2 00:15:00.149 } 00:15:00.149 ], 00:15:00.149 "driver_specific": { 00:15:00.149 "raid": { 00:15:00.149 "uuid": "e570dd29-54b9-4592-b2d4-6af6c93e9e85", 00:15:00.149 "strip_size_kb": 64, 00:15:00.149 "state": "online", 00:15:00.149 "raid_level": "concat", 00:15:00.149 "superblock": false, 00:15:00.149 "num_base_bdevs": 3, 00:15:00.149 "num_base_bdevs_discovered": 3, 00:15:00.149 "num_base_bdevs_operational": 3, 00:15:00.149 "base_bdevs_list": [ 00:15:00.149 { 00:15:00.149 "name": "NewBaseBdev", 00:15:00.149 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:15:00.149 "is_configured": true, 00:15:00.149 "data_offset": 0, 00:15:00.149 "data_size": 65536 00:15:00.149 }, 00:15:00.149 { 00:15:00.149 "name": "BaseBdev2", 00:15:00.149 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:15:00.149 "is_configured": true, 00:15:00.149 "data_offset": 0, 00:15:00.149 "data_size": 65536 00:15:00.149 }, 00:15:00.149 { 00:15:00.149 "name": "BaseBdev3", 00:15:00.149 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:15:00.149 "is_configured": true, 00:15:00.149 "data_offset": 0, 00:15:00.149 "data_size": 65536 00:15:00.149 } 00:15:00.149 ] 00:15:00.149 } 00:15:00.150 } 00:15:00.150 }' 00:15:00.150 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:00.150 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:00.150 BaseBdev2 00:15:00.150 BaseBdev3' 00:15:00.150 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:00.150 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:00.150 09:19:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:00.407 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:00.407 "name": "NewBaseBdev", 00:15:00.407 "aliases": [ 00:15:00.407 "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048" 00:15:00.407 ], 00:15:00.407 "product_name": "Malloc disk", 00:15:00.407 "block_size": 512, 00:15:00.407 "num_blocks": 65536, 00:15:00.407 "uuid": "4d0b4b5e-6d94-4531-afeb-5cc8e9c22048", 00:15:00.407 "assigned_rate_limits": { 00:15:00.407 "rw_ios_per_sec": 0, 00:15:00.407 "rw_mbytes_per_sec": 0, 00:15:00.407 "r_mbytes_per_sec": 0, 00:15:00.407 "w_mbytes_per_sec": 0 00:15:00.407 }, 00:15:00.407 "claimed": true, 00:15:00.407 "claim_type": "exclusive_write", 00:15:00.407 "zoned": false, 00:15:00.407 "supported_io_types": { 00:15:00.407 "read": true, 00:15:00.407 "write": true, 00:15:00.407 "unmap": true, 00:15:00.407 "flush": true, 00:15:00.407 "reset": true, 00:15:00.407 "nvme_admin": false, 00:15:00.407 "nvme_io": false, 00:15:00.407 "nvme_io_md": false, 00:15:00.407 "write_zeroes": true, 00:15:00.407 "zcopy": true, 00:15:00.407 "get_zone_info": false, 00:15:00.407 "zone_management": false, 00:15:00.407 "zone_append": false, 00:15:00.407 "compare": false, 00:15:00.407 "compare_and_write": false, 00:15:00.407 "abort": true, 00:15:00.407 "seek_hole": false, 00:15:00.407 "seek_data": false, 00:15:00.407 "copy": true, 00:15:00.407 "nvme_iov_md": false 00:15:00.407 }, 00:15:00.407 "memory_domains": [ 00:15:00.407 { 00:15:00.407 "dma_device_id": "system", 00:15:00.407 "dma_device_type": 1 00:15:00.407 }, 00:15:00.407 { 00:15:00.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.407 "dma_device_type": 2 00:15:00.407 } 00:15:00.407 ], 00:15:00.407 "driver_specific": {} 00:15:00.407 }' 00:15:00.407 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.407 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.407 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:00.407 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:00.407 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:00.407 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:00.407 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:00.663 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:00.664 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:00.664 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.664 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.664 09:19:09 
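The property pass above is the same two RPCs driven through jq: dump the assembled raid volume once, list which base bdevs it reports as configured, then dump each of those and compare block_size, md_size, md_interleave and dif_type. A condensed sketch, with names and paths as in this run and the shell variables our own:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # The raid volume itself: product_name "Raid Volume", 196608 blocks = 3 x 65536.
  $rpc -s $sock bdev_get_bdevs -b Existed_Raid | jq '.[]'
  # Names of the base bdevs currently wired into the raid.
  $rpc -s $sock bdev_get_bdevs -b Existed_Raid \
      | jq -r '.[].driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
  # Per-member check, e.g. block size of one base bdev (the test loops over all of them).
  $rpc -s $sock bdev_get_bdevs -b NewBaseBdev | jq '.[].block_size'
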
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:00.664 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:00.664 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:00.664 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:00.921 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:00.921 "name": "BaseBdev2", 00:15:00.921 "aliases": [ 00:15:00.921 "b3107fb7-75ae-4ef3-88ee-e3f527e2c057" 00:15:00.921 ], 00:15:00.921 "product_name": "Malloc disk", 00:15:00.921 "block_size": 512, 00:15:00.921 "num_blocks": 65536, 00:15:00.921 "uuid": "b3107fb7-75ae-4ef3-88ee-e3f527e2c057", 00:15:00.921 "assigned_rate_limits": { 00:15:00.921 "rw_ios_per_sec": 0, 00:15:00.921 "rw_mbytes_per_sec": 0, 00:15:00.921 "r_mbytes_per_sec": 0, 00:15:00.921 "w_mbytes_per_sec": 0 00:15:00.921 }, 00:15:00.921 "claimed": true, 00:15:00.921 "claim_type": "exclusive_write", 00:15:00.921 "zoned": false, 00:15:00.921 "supported_io_types": { 00:15:00.921 "read": true, 00:15:00.921 "write": true, 00:15:00.921 "unmap": true, 00:15:00.921 "flush": true, 00:15:00.921 "reset": true, 00:15:00.921 "nvme_admin": false, 00:15:00.921 "nvme_io": false, 00:15:00.921 "nvme_io_md": false, 00:15:00.921 "write_zeroes": true, 00:15:00.921 "zcopy": true, 00:15:00.921 "get_zone_info": false, 00:15:00.921 "zone_management": false, 00:15:00.921 "zone_append": false, 00:15:00.921 "compare": false, 00:15:00.921 "compare_and_write": false, 00:15:00.921 "abort": true, 00:15:00.921 "seek_hole": false, 00:15:00.921 "seek_data": false, 00:15:00.921 "copy": true, 00:15:00.921 "nvme_iov_md": false 00:15:00.921 }, 00:15:00.921 "memory_domains": [ 00:15:00.921 { 00:15:00.921 "dma_device_id": "system", 00:15:00.921 "dma_device_type": 1 00:15:00.921 }, 00:15:00.921 { 00:15:00.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.921 "dma_device_type": 2 00:15:00.921 } 00:15:00.921 ], 00:15:00.921 "driver_specific": {} 00:15:00.921 }' 00:15:00.921 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.921 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.921 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:00.921 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:01.178 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:01.178 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:01.178 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:01.178 09:19:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:01.178 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:01.178 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:01.178 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:01.178 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:01.178 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:15:01.178 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:01.178 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:01.435 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:01.435 "name": "BaseBdev3", 00:15:01.435 "aliases": [ 00:15:01.435 "263011ef-fdbd-48b0-bc5a-6b309de223ab" 00:15:01.435 ], 00:15:01.435 "product_name": "Malloc disk", 00:15:01.435 "block_size": 512, 00:15:01.435 "num_blocks": 65536, 00:15:01.435 "uuid": "263011ef-fdbd-48b0-bc5a-6b309de223ab", 00:15:01.435 "assigned_rate_limits": { 00:15:01.435 "rw_ios_per_sec": 0, 00:15:01.435 "rw_mbytes_per_sec": 0, 00:15:01.435 "r_mbytes_per_sec": 0, 00:15:01.435 "w_mbytes_per_sec": 0 00:15:01.435 }, 00:15:01.435 "claimed": true, 00:15:01.435 "claim_type": "exclusive_write", 00:15:01.435 "zoned": false, 00:15:01.435 "supported_io_types": { 00:15:01.435 "read": true, 00:15:01.435 "write": true, 00:15:01.435 "unmap": true, 00:15:01.435 "flush": true, 00:15:01.435 "reset": true, 00:15:01.435 "nvme_admin": false, 00:15:01.435 "nvme_io": false, 00:15:01.435 "nvme_io_md": false, 00:15:01.435 "write_zeroes": true, 00:15:01.435 "zcopy": true, 00:15:01.435 "get_zone_info": false, 00:15:01.435 "zone_management": false, 00:15:01.435 "zone_append": false, 00:15:01.435 "compare": false, 00:15:01.435 "compare_and_write": false, 00:15:01.435 "abort": true, 00:15:01.435 "seek_hole": false, 00:15:01.435 "seek_data": false, 00:15:01.435 "copy": true, 00:15:01.435 "nvme_iov_md": false 00:15:01.435 }, 00:15:01.435 "memory_domains": [ 00:15:01.435 { 00:15:01.435 "dma_device_id": "system", 00:15:01.435 "dma_device_type": 1 00:15:01.435 }, 00:15:01.435 { 00:15:01.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.435 "dma_device_type": 2 00:15:01.435 } 00:15:01.435 ], 00:15:01.435 "driver_specific": {} 00:15:01.435 }' 00:15:01.435 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:01.692 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:01.692 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:01.692 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:01.692 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:01.692 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:01.692 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:01.692 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:01.692 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:01.692 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:01.962 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:01.962 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:01.962 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:02.219 [2024-07-15 09:19:10.922813] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:02.219 [2024-07-15 09:19:10.922840] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:02.219 [2024-07-15 09:19:10.922889] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:02.219 [2024-07-15 09:19:10.922945] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:02.219 [2024-07-15 09:19:10.922958] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f5450 name Existed_Raid, state offline 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 117674 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 117674 ']' 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 117674 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 117674 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 117674' 00:15:02.219 killing process with pid 117674 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 117674 00:15:02.219 [2024-07-15 09:19:10.990297] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:02.219 09:19:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 117674 00:15:02.219 [2024-07-15 09:19:11.020864] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:02.477 00:15:02.477 real 0m29.151s 00:15:02.477 user 0m53.502s 00:15:02.477 sys 0m5.158s 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.477 ************************************ 00:15:02.477 END TEST raid_state_function_test 00:15:02.477 ************************************ 00:15:02.477 09:19:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:02.477 09:19:11 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:02.477 09:19:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:02.477 09:19:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:02.477 09:19:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:02.477 ************************************ 00:15:02.477 START TEST raid_state_function_test_sb 00:15:02.477 ************************************ 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local 
raid_level=concat 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=122134 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 122134' 00:15:02.477 Process raid pid: 122134 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 122134 /var/tmp/spdk-raid.sock 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 122134 ']' 00:15:02.477 
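For the superblock variant the harness first brings up a bare bdev target and then drives it over the same private RPC socket; the create that follows in the trace is issued before any base bdev exists, which is why the raid sits in "configuring" with 0 of 3 members discovered. A sketch of those two steps, paths exactly as in this run (backgrounding and the waitforlisten handshake are left to the harness):

  # Bare bdev service on a private RPC socket with raid debug logging enabled.
  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
      -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  # 3-member concat raid, 64 KiB strip, with an on-disk superblock (-s); the missing
  # base bdevs are simply noted as "doesn't exist now" and the raid stays configuring.
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
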
09:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:02.477 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:02.477 09:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:02.477 [2024-07-15 09:19:11.392617] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:15:02.477 [2024-07-15 09:19:11.392683] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:02.734 [2024-07-15 09:19:11.521812] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:02.734 [2024-07-15 09:19:11.623485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:02.734 [2024-07-15 09:19:11.686486] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:02.734 [2024-07-15 09:19:11.686521] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:03.666 [2024-07-15 09:19:12.546669] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:03.666 [2024-07-15 09:19:12.546713] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:03.666 [2024-07-15 09:19:12.546724] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:03.666 [2024-07-15 09:19:12.546736] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:03.666 [2024-07-15 09:19:12.546745] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:03.666 [2024-07-15 09:19:12.546756] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.666 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:03.930 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.930 "name": "Existed_Raid", 00:15:03.930 "uuid": "59a22831-9e4d-48c4-a44d-a56170eeed67", 00:15:03.930 "strip_size_kb": 64, 00:15:03.930 "state": "configuring", 00:15:03.930 "raid_level": "concat", 00:15:03.930 "superblock": true, 00:15:03.930 "num_base_bdevs": 3, 00:15:03.930 "num_base_bdevs_discovered": 0, 00:15:03.930 "num_base_bdevs_operational": 3, 00:15:03.930 "base_bdevs_list": [ 00:15:03.930 { 00:15:03.930 "name": "BaseBdev1", 00:15:03.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.930 "is_configured": false, 00:15:03.930 "data_offset": 0, 00:15:03.930 "data_size": 0 00:15:03.930 }, 00:15:03.930 { 00:15:03.930 "name": "BaseBdev2", 00:15:03.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.930 "is_configured": false, 00:15:03.930 "data_offset": 0, 00:15:03.930 "data_size": 0 00:15:03.930 }, 00:15:03.930 { 00:15:03.930 "name": "BaseBdev3", 00:15:03.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.930 "is_configured": false, 00:15:03.930 "data_offset": 0, 00:15:03.930 "data_size": 0 00:15:03.930 } 00:15:03.930 ] 00:15:03.930 }' 00:15:03.930 09:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.930 09:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:04.502 09:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:04.768 [2024-07-15 09:19:13.645416] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:04.768 [2024-07-15 09:19:13.645448] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b05a80 name Existed_Raid, state configuring 00:15:04.768 09:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:05.058 [2024-07-15 09:19:13.886079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:05.058 [2024-07-15 09:19:13.886109] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:05.058 [2024-07-15 09:19:13.886119] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:05.058 [2024-07-15 09:19:13.886131] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:05.058 
[2024-07-15 09:19:13.886139] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:05.058 [2024-07-15 09:19:13.886151] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:05.058 09:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:05.316 [2024-07-15 09:19:14.141785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:05.316 BaseBdev1 00:15:05.316 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:05.316 09:19:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:05.316 09:19:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:05.316 09:19:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:05.316 09:19:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:05.316 09:19:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:05.316 09:19:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:05.575 09:19:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:05.833 [ 00:15:05.833 { 00:15:05.833 "name": "BaseBdev1", 00:15:05.833 "aliases": [ 00:15:05.833 "a3efdc28-7224-4aeb-976b-74fe2ec50323" 00:15:05.833 ], 00:15:05.833 "product_name": "Malloc disk", 00:15:05.833 "block_size": 512, 00:15:05.833 "num_blocks": 65536, 00:15:05.833 "uuid": "a3efdc28-7224-4aeb-976b-74fe2ec50323", 00:15:05.833 "assigned_rate_limits": { 00:15:05.834 "rw_ios_per_sec": 0, 00:15:05.834 "rw_mbytes_per_sec": 0, 00:15:05.834 "r_mbytes_per_sec": 0, 00:15:05.834 "w_mbytes_per_sec": 0 00:15:05.834 }, 00:15:05.834 "claimed": true, 00:15:05.834 "claim_type": "exclusive_write", 00:15:05.834 "zoned": false, 00:15:05.834 "supported_io_types": { 00:15:05.834 "read": true, 00:15:05.834 "write": true, 00:15:05.834 "unmap": true, 00:15:05.834 "flush": true, 00:15:05.834 "reset": true, 00:15:05.834 "nvme_admin": false, 00:15:05.834 "nvme_io": false, 00:15:05.834 "nvme_io_md": false, 00:15:05.834 "write_zeroes": true, 00:15:05.834 "zcopy": true, 00:15:05.834 "get_zone_info": false, 00:15:05.834 "zone_management": false, 00:15:05.834 "zone_append": false, 00:15:05.834 "compare": false, 00:15:05.834 "compare_and_write": false, 00:15:05.834 "abort": true, 00:15:05.834 "seek_hole": false, 00:15:05.834 "seek_data": false, 00:15:05.834 "copy": true, 00:15:05.834 "nvme_iov_md": false 00:15:05.834 }, 00:15:05.834 "memory_domains": [ 00:15:05.834 { 00:15:05.834 "dma_device_id": "system", 00:15:05.834 "dma_device_type": 1 00:15:05.834 }, 00:15:05.834 { 00:15:05.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.834 "dma_device_type": 2 00:15:05.834 } 00:15:05.834 ], 00:15:05.834 "driver_specific": {} 00:15:05.834 } 00:15:05.834 ] 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:05.834 09:19:14 
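Adding the first member is the same create-and-wait pattern as before; the superblock-specific difference shows up in the state dump that follows, where each 65536-block disk contributes data_size 63488 behind a data_offset of 2048 blocks reserved for the superblock. A sketch of this step, commands as in the trace and $rpc/$sock our shorthand:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # 32 MiB malloc disk, 512 B blocks; the waiting concat raid claims it as BaseBdev1.
  $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1
  $rpc -s $sock bdev_wait_for_examine
  $rpc -s $sock bdev_get_bdevs -b BaseBdev1 -t 2000
  # Still "configuring": 1 of 3 members discovered, data_offset 2048 / data_size 63488.
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
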
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.834 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.092 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.092 "name": "Existed_Raid", 00:15:06.092 "uuid": "fc328e47-484f-4fc7-bfe7-c721b260ef20", 00:15:06.092 "strip_size_kb": 64, 00:15:06.092 "state": "configuring", 00:15:06.092 "raid_level": "concat", 00:15:06.092 "superblock": true, 00:15:06.092 "num_base_bdevs": 3, 00:15:06.092 "num_base_bdevs_discovered": 1, 00:15:06.092 "num_base_bdevs_operational": 3, 00:15:06.092 "base_bdevs_list": [ 00:15:06.092 { 00:15:06.092 "name": "BaseBdev1", 00:15:06.092 "uuid": "a3efdc28-7224-4aeb-976b-74fe2ec50323", 00:15:06.092 "is_configured": true, 00:15:06.092 "data_offset": 2048, 00:15:06.092 "data_size": 63488 00:15:06.092 }, 00:15:06.092 { 00:15:06.092 "name": "BaseBdev2", 00:15:06.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.092 "is_configured": false, 00:15:06.092 "data_offset": 0, 00:15:06.092 "data_size": 0 00:15:06.092 }, 00:15:06.092 { 00:15:06.092 "name": "BaseBdev3", 00:15:06.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.092 "is_configured": false, 00:15:06.092 "data_offset": 0, 00:15:06.092 "data_size": 0 00:15:06.092 } 00:15:06.092 ] 00:15:06.092 }' 00:15:06.092 09:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.092 09:19:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:06.657 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:06.916 [2024-07-15 09:19:15.717961] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:06.916 [2024-07-15 09:19:15.717998] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b05310 name Existed_Raid, state configuring 00:15:06.916 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:07.174 [2024-07-15 09:19:15.962640] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:07.174 [2024-07-15 09:19:15.964078] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:07.174 [2024-07-15 09:19:15.964110] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:07.174 [2024-07-15 09:19:15.964120] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:07.174 [2024-07-15 09:19:15.964132] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.174 09:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:07.431 09:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.431 "name": "Existed_Raid", 00:15:07.431 "uuid": "9d1b8a66-bfd3-4508-b25a-b2dee2011991", 00:15:07.431 "strip_size_kb": 64, 00:15:07.431 "state": "configuring", 00:15:07.431 "raid_level": "concat", 00:15:07.431 "superblock": true, 00:15:07.431 "num_base_bdevs": 3, 00:15:07.431 "num_base_bdevs_discovered": 1, 00:15:07.431 "num_base_bdevs_operational": 3, 00:15:07.431 "base_bdevs_list": [ 00:15:07.431 { 00:15:07.431 "name": "BaseBdev1", 00:15:07.431 "uuid": "a3efdc28-7224-4aeb-976b-74fe2ec50323", 00:15:07.431 "is_configured": true, 00:15:07.431 "data_offset": 2048, 00:15:07.431 "data_size": 63488 00:15:07.431 }, 00:15:07.431 { 00:15:07.431 "name": "BaseBdev2", 00:15:07.431 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.431 "is_configured": false, 00:15:07.431 "data_offset": 0, 00:15:07.431 "data_size": 0 00:15:07.431 }, 00:15:07.431 { 
00:15:07.431 "name": "BaseBdev3", 00:15:07.431 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.431 "is_configured": false, 00:15:07.431 "data_offset": 0, 00:15:07.431 "data_size": 0 00:15:07.431 } 00:15:07.431 ] 00:15:07.431 }' 00:15:07.431 09:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.431 09:19:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.996 09:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:08.255 [2024-07-15 09:19:17.073783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:08.255 BaseBdev2 00:15:08.255 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:08.255 09:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:08.255 09:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:08.255 09:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:08.255 09:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:08.255 09:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:08.255 09:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:08.515 09:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:08.773 [ 00:15:08.773 { 00:15:08.773 "name": "BaseBdev2", 00:15:08.773 "aliases": [ 00:15:08.773 "8452022d-9972-4af6-9e98-3c54d687b973" 00:15:08.773 ], 00:15:08.773 "product_name": "Malloc disk", 00:15:08.773 "block_size": 512, 00:15:08.773 "num_blocks": 65536, 00:15:08.773 "uuid": "8452022d-9972-4af6-9e98-3c54d687b973", 00:15:08.773 "assigned_rate_limits": { 00:15:08.773 "rw_ios_per_sec": 0, 00:15:08.773 "rw_mbytes_per_sec": 0, 00:15:08.773 "r_mbytes_per_sec": 0, 00:15:08.773 "w_mbytes_per_sec": 0 00:15:08.773 }, 00:15:08.773 "claimed": true, 00:15:08.773 "claim_type": "exclusive_write", 00:15:08.773 "zoned": false, 00:15:08.773 "supported_io_types": { 00:15:08.773 "read": true, 00:15:08.773 "write": true, 00:15:08.773 "unmap": true, 00:15:08.773 "flush": true, 00:15:08.773 "reset": true, 00:15:08.773 "nvme_admin": false, 00:15:08.773 "nvme_io": false, 00:15:08.773 "nvme_io_md": false, 00:15:08.773 "write_zeroes": true, 00:15:08.773 "zcopy": true, 00:15:08.773 "get_zone_info": false, 00:15:08.773 "zone_management": false, 00:15:08.773 "zone_append": false, 00:15:08.773 "compare": false, 00:15:08.773 "compare_and_write": false, 00:15:08.773 "abort": true, 00:15:08.773 "seek_hole": false, 00:15:08.773 "seek_data": false, 00:15:08.773 "copy": true, 00:15:08.773 "nvme_iov_md": false 00:15:08.773 }, 00:15:08.773 "memory_domains": [ 00:15:08.773 { 00:15:08.773 "dma_device_id": "system", 00:15:08.773 "dma_device_type": 1 00:15:08.773 }, 00:15:08.773 { 00:15:08.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.773 "dma_device_type": 2 00:15:08.773 } 00:15:08.773 ], 00:15:08.773 
"driver_specific": {} 00:15:08.773 } 00:15:08.773 ] 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.773 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:09.032 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.032 "name": "Existed_Raid", 00:15:09.032 "uuid": "9d1b8a66-bfd3-4508-b25a-b2dee2011991", 00:15:09.032 "strip_size_kb": 64, 00:15:09.032 "state": "configuring", 00:15:09.032 "raid_level": "concat", 00:15:09.032 "superblock": true, 00:15:09.032 "num_base_bdevs": 3, 00:15:09.032 "num_base_bdevs_discovered": 2, 00:15:09.032 "num_base_bdevs_operational": 3, 00:15:09.032 "base_bdevs_list": [ 00:15:09.032 { 00:15:09.032 "name": "BaseBdev1", 00:15:09.032 "uuid": "a3efdc28-7224-4aeb-976b-74fe2ec50323", 00:15:09.032 "is_configured": true, 00:15:09.032 "data_offset": 2048, 00:15:09.032 "data_size": 63488 00:15:09.032 }, 00:15:09.032 { 00:15:09.032 "name": "BaseBdev2", 00:15:09.032 "uuid": "8452022d-9972-4af6-9e98-3c54d687b973", 00:15:09.032 "is_configured": true, 00:15:09.032 "data_offset": 2048, 00:15:09.032 "data_size": 63488 00:15:09.032 }, 00:15:09.032 { 00:15:09.032 "name": "BaseBdev3", 00:15:09.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.032 "is_configured": false, 00:15:09.032 "data_offset": 0, 00:15:09.032 "data_size": 0 00:15:09.032 } 00:15:09.032 ] 00:15:09.032 }' 00:15:09.032 09:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.032 09:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:09.598 09:19:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 
00:15:09.855 [2024-07-15 09:19:18.641424] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:09.855 [2024-07-15 09:19:18.641582] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b06400 00:15:09.855 [2024-07-15 09:19:18.641595] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:09.855 [2024-07-15 09:19:18.641763] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b05ef0 00:15:09.855 [2024-07-15 09:19:18.641877] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b06400 00:15:09.855 [2024-07-15 09:19:18.641886] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b06400 00:15:09.855 [2024-07-15 09:19:18.641986] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:09.855 BaseBdev3 00:15:09.855 09:19:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:09.855 09:19:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:09.855 09:19:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:09.855 09:19:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:09.855 09:19:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:09.855 09:19:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:09.855 09:19:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:10.113 09:19:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:10.370 [ 00:15:10.370 { 00:15:10.370 "name": "BaseBdev3", 00:15:10.370 "aliases": [ 00:15:10.370 "0466c6ea-bc6f-472a-a866-e0c392df5ec9" 00:15:10.370 ], 00:15:10.370 "product_name": "Malloc disk", 00:15:10.370 "block_size": 512, 00:15:10.370 "num_blocks": 65536, 00:15:10.370 "uuid": "0466c6ea-bc6f-472a-a866-e0c392df5ec9", 00:15:10.370 "assigned_rate_limits": { 00:15:10.370 "rw_ios_per_sec": 0, 00:15:10.370 "rw_mbytes_per_sec": 0, 00:15:10.370 "r_mbytes_per_sec": 0, 00:15:10.370 "w_mbytes_per_sec": 0 00:15:10.370 }, 00:15:10.370 "claimed": true, 00:15:10.370 "claim_type": "exclusive_write", 00:15:10.371 "zoned": false, 00:15:10.371 "supported_io_types": { 00:15:10.371 "read": true, 00:15:10.371 "write": true, 00:15:10.371 "unmap": true, 00:15:10.371 "flush": true, 00:15:10.371 "reset": true, 00:15:10.371 "nvme_admin": false, 00:15:10.371 "nvme_io": false, 00:15:10.371 "nvme_io_md": false, 00:15:10.371 "write_zeroes": true, 00:15:10.371 "zcopy": true, 00:15:10.371 "get_zone_info": false, 00:15:10.371 "zone_management": false, 00:15:10.371 "zone_append": false, 00:15:10.371 "compare": false, 00:15:10.371 "compare_and_write": false, 00:15:10.371 "abort": true, 00:15:10.371 "seek_hole": false, 00:15:10.371 "seek_data": false, 00:15:10.371 "copy": true, 00:15:10.371 "nvme_iov_md": false 00:15:10.371 }, 00:15:10.371 "memory_domains": [ 00:15:10.371 { 00:15:10.371 "dma_device_id": "system", 00:15:10.371 "dma_device_type": 1 00:15:10.371 }, 00:15:10.371 { 00:15:10.371 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:10.371 "dma_device_type": 2 00:15:10.371 } 00:15:10.371 ], 00:15:10.371 "driver_specific": {} 00:15:10.371 } 00:15:10.371 ] 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.371 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.630 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.630 "name": "Existed_Raid", 00:15:10.630 "uuid": "9d1b8a66-bfd3-4508-b25a-b2dee2011991", 00:15:10.630 "strip_size_kb": 64, 00:15:10.630 "state": "online", 00:15:10.630 "raid_level": "concat", 00:15:10.630 "superblock": true, 00:15:10.630 "num_base_bdevs": 3, 00:15:10.630 "num_base_bdevs_discovered": 3, 00:15:10.630 "num_base_bdevs_operational": 3, 00:15:10.630 "base_bdevs_list": [ 00:15:10.630 { 00:15:10.630 "name": "BaseBdev1", 00:15:10.630 "uuid": "a3efdc28-7224-4aeb-976b-74fe2ec50323", 00:15:10.630 "is_configured": true, 00:15:10.630 "data_offset": 2048, 00:15:10.630 "data_size": 63488 00:15:10.630 }, 00:15:10.630 { 00:15:10.630 "name": "BaseBdev2", 00:15:10.630 "uuid": "8452022d-9972-4af6-9e98-3c54d687b973", 00:15:10.630 "is_configured": true, 00:15:10.630 "data_offset": 2048, 00:15:10.630 "data_size": 63488 00:15:10.630 }, 00:15:10.630 { 00:15:10.630 "name": "BaseBdev3", 00:15:10.630 "uuid": "0466c6ea-bc6f-472a-a866-e0c392df5ec9", 00:15:10.630 "is_configured": true, 00:15:10.630 "data_offset": 2048, 00:15:10.630 "data_size": 63488 00:15:10.630 } 00:15:10.630 ] 00:15:10.630 }' 00:15:10.630 09:19:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.630 09:19:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.198 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 
00:15:11.198 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:11.198 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:11.198 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:11.198 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:11.198 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:11.198 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:11.198 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:11.456 [2024-07-15 09:19:20.233972] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:11.457 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:11.457 "name": "Existed_Raid", 00:15:11.457 "aliases": [ 00:15:11.457 "9d1b8a66-bfd3-4508-b25a-b2dee2011991" 00:15:11.457 ], 00:15:11.457 "product_name": "Raid Volume", 00:15:11.457 "block_size": 512, 00:15:11.457 "num_blocks": 190464, 00:15:11.457 "uuid": "9d1b8a66-bfd3-4508-b25a-b2dee2011991", 00:15:11.457 "assigned_rate_limits": { 00:15:11.457 "rw_ios_per_sec": 0, 00:15:11.457 "rw_mbytes_per_sec": 0, 00:15:11.457 "r_mbytes_per_sec": 0, 00:15:11.457 "w_mbytes_per_sec": 0 00:15:11.457 }, 00:15:11.457 "claimed": false, 00:15:11.457 "zoned": false, 00:15:11.457 "supported_io_types": { 00:15:11.457 "read": true, 00:15:11.457 "write": true, 00:15:11.457 "unmap": true, 00:15:11.457 "flush": true, 00:15:11.457 "reset": true, 00:15:11.457 "nvme_admin": false, 00:15:11.457 "nvme_io": false, 00:15:11.457 "nvme_io_md": false, 00:15:11.457 "write_zeroes": true, 00:15:11.457 "zcopy": false, 00:15:11.457 "get_zone_info": false, 00:15:11.457 "zone_management": false, 00:15:11.457 "zone_append": false, 00:15:11.457 "compare": false, 00:15:11.457 "compare_and_write": false, 00:15:11.457 "abort": false, 00:15:11.457 "seek_hole": false, 00:15:11.457 "seek_data": false, 00:15:11.457 "copy": false, 00:15:11.457 "nvme_iov_md": false 00:15:11.457 }, 00:15:11.457 "memory_domains": [ 00:15:11.457 { 00:15:11.457 "dma_device_id": "system", 00:15:11.457 "dma_device_type": 1 00:15:11.457 }, 00:15:11.457 { 00:15:11.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.457 "dma_device_type": 2 00:15:11.457 }, 00:15:11.457 { 00:15:11.457 "dma_device_id": "system", 00:15:11.457 "dma_device_type": 1 00:15:11.457 }, 00:15:11.457 { 00:15:11.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.457 "dma_device_type": 2 00:15:11.457 }, 00:15:11.457 { 00:15:11.457 "dma_device_id": "system", 00:15:11.457 "dma_device_type": 1 00:15:11.457 }, 00:15:11.457 { 00:15:11.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.457 "dma_device_type": 2 00:15:11.457 } 00:15:11.457 ], 00:15:11.457 "driver_specific": { 00:15:11.457 "raid": { 00:15:11.457 "uuid": "9d1b8a66-bfd3-4508-b25a-b2dee2011991", 00:15:11.457 "strip_size_kb": 64, 00:15:11.457 "state": "online", 00:15:11.457 "raid_level": "concat", 00:15:11.457 "superblock": true, 00:15:11.457 "num_base_bdevs": 3, 00:15:11.457 "num_base_bdevs_discovered": 3, 00:15:11.457 "num_base_bdevs_operational": 3, 00:15:11.457 "base_bdevs_list": [ 00:15:11.457 { 00:15:11.457 "name": "BaseBdev1", 00:15:11.457 
"uuid": "a3efdc28-7224-4aeb-976b-74fe2ec50323", 00:15:11.457 "is_configured": true, 00:15:11.457 "data_offset": 2048, 00:15:11.457 "data_size": 63488 00:15:11.457 }, 00:15:11.457 { 00:15:11.457 "name": "BaseBdev2", 00:15:11.457 "uuid": "8452022d-9972-4af6-9e98-3c54d687b973", 00:15:11.457 "is_configured": true, 00:15:11.457 "data_offset": 2048, 00:15:11.457 "data_size": 63488 00:15:11.457 }, 00:15:11.457 { 00:15:11.457 "name": "BaseBdev3", 00:15:11.457 "uuid": "0466c6ea-bc6f-472a-a866-e0c392df5ec9", 00:15:11.457 "is_configured": true, 00:15:11.457 "data_offset": 2048, 00:15:11.457 "data_size": 63488 00:15:11.457 } 00:15:11.457 ] 00:15:11.457 } 00:15:11.457 } 00:15:11.457 }' 00:15:11.457 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:11.457 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:11.457 BaseBdev2 00:15:11.457 BaseBdev3' 00:15:11.457 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:11.457 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:11.457 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:11.717 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:11.717 "name": "BaseBdev1", 00:15:11.717 "aliases": [ 00:15:11.717 "a3efdc28-7224-4aeb-976b-74fe2ec50323" 00:15:11.717 ], 00:15:11.717 "product_name": "Malloc disk", 00:15:11.717 "block_size": 512, 00:15:11.717 "num_blocks": 65536, 00:15:11.717 "uuid": "a3efdc28-7224-4aeb-976b-74fe2ec50323", 00:15:11.717 "assigned_rate_limits": { 00:15:11.717 "rw_ios_per_sec": 0, 00:15:11.717 "rw_mbytes_per_sec": 0, 00:15:11.717 "r_mbytes_per_sec": 0, 00:15:11.717 "w_mbytes_per_sec": 0 00:15:11.717 }, 00:15:11.717 "claimed": true, 00:15:11.717 "claim_type": "exclusive_write", 00:15:11.717 "zoned": false, 00:15:11.717 "supported_io_types": { 00:15:11.717 "read": true, 00:15:11.717 "write": true, 00:15:11.717 "unmap": true, 00:15:11.717 "flush": true, 00:15:11.717 "reset": true, 00:15:11.717 "nvme_admin": false, 00:15:11.717 "nvme_io": false, 00:15:11.717 "nvme_io_md": false, 00:15:11.717 "write_zeroes": true, 00:15:11.717 "zcopy": true, 00:15:11.717 "get_zone_info": false, 00:15:11.717 "zone_management": false, 00:15:11.717 "zone_append": false, 00:15:11.717 "compare": false, 00:15:11.717 "compare_and_write": false, 00:15:11.717 "abort": true, 00:15:11.717 "seek_hole": false, 00:15:11.717 "seek_data": false, 00:15:11.717 "copy": true, 00:15:11.717 "nvme_iov_md": false 00:15:11.717 }, 00:15:11.717 "memory_domains": [ 00:15:11.717 { 00:15:11.717 "dma_device_id": "system", 00:15:11.717 "dma_device_type": 1 00:15:11.717 }, 00:15:11.717 { 00:15:11.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.717 "dma_device_type": 2 00:15:11.717 } 00:15:11.717 ], 00:15:11.717 "driver_specific": {} 00:15:11.717 }' 00:15:11.717 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.717 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.717 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:11.717 09:19:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.974 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.974 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:11.974 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.974 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.974 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:11.974 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:11.974 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.233 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:12.233 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:12.233 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:12.233 09:19:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:12.491 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:12.491 "name": "BaseBdev2", 00:15:12.491 "aliases": [ 00:15:12.491 "8452022d-9972-4af6-9e98-3c54d687b973" 00:15:12.491 ], 00:15:12.491 "product_name": "Malloc disk", 00:15:12.491 "block_size": 512, 00:15:12.491 "num_blocks": 65536, 00:15:12.492 "uuid": "8452022d-9972-4af6-9e98-3c54d687b973", 00:15:12.492 "assigned_rate_limits": { 00:15:12.492 "rw_ios_per_sec": 0, 00:15:12.492 "rw_mbytes_per_sec": 0, 00:15:12.492 "r_mbytes_per_sec": 0, 00:15:12.492 "w_mbytes_per_sec": 0 00:15:12.492 }, 00:15:12.492 "claimed": true, 00:15:12.492 "claim_type": "exclusive_write", 00:15:12.492 "zoned": false, 00:15:12.492 "supported_io_types": { 00:15:12.492 "read": true, 00:15:12.492 "write": true, 00:15:12.492 "unmap": true, 00:15:12.492 "flush": true, 00:15:12.492 "reset": true, 00:15:12.492 "nvme_admin": false, 00:15:12.492 "nvme_io": false, 00:15:12.492 "nvme_io_md": false, 00:15:12.492 "write_zeroes": true, 00:15:12.492 "zcopy": true, 00:15:12.492 "get_zone_info": false, 00:15:12.492 "zone_management": false, 00:15:12.492 "zone_append": false, 00:15:12.492 "compare": false, 00:15:12.492 "compare_and_write": false, 00:15:12.492 "abort": true, 00:15:12.492 "seek_hole": false, 00:15:12.492 "seek_data": false, 00:15:12.492 "copy": true, 00:15:12.492 "nvme_iov_md": false 00:15:12.492 }, 00:15:12.492 "memory_domains": [ 00:15:12.492 { 00:15:12.492 "dma_device_id": "system", 00:15:12.492 "dma_device_type": 1 00:15:12.492 }, 00:15:12.492 { 00:15:12.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.492 "dma_device_type": 2 00:15:12.492 } 00:15:12.492 ], 00:15:12.492 "driver_specific": {} 00:15:12.492 }' 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:12.492 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.750 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.750 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:12.750 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:12.750 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:12.750 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:13.009 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:13.009 "name": "BaseBdev3", 00:15:13.009 "aliases": [ 00:15:13.009 "0466c6ea-bc6f-472a-a866-e0c392df5ec9" 00:15:13.009 ], 00:15:13.009 "product_name": "Malloc disk", 00:15:13.009 "block_size": 512, 00:15:13.009 "num_blocks": 65536, 00:15:13.009 "uuid": "0466c6ea-bc6f-472a-a866-e0c392df5ec9", 00:15:13.009 "assigned_rate_limits": { 00:15:13.009 "rw_ios_per_sec": 0, 00:15:13.009 "rw_mbytes_per_sec": 0, 00:15:13.009 "r_mbytes_per_sec": 0, 00:15:13.009 "w_mbytes_per_sec": 0 00:15:13.009 }, 00:15:13.009 "claimed": true, 00:15:13.009 "claim_type": "exclusive_write", 00:15:13.009 "zoned": false, 00:15:13.009 "supported_io_types": { 00:15:13.009 "read": true, 00:15:13.009 "write": true, 00:15:13.009 "unmap": true, 00:15:13.009 "flush": true, 00:15:13.009 "reset": true, 00:15:13.009 "nvme_admin": false, 00:15:13.009 "nvme_io": false, 00:15:13.009 "nvme_io_md": false, 00:15:13.009 "write_zeroes": true, 00:15:13.009 "zcopy": true, 00:15:13.009 "get_zone_info": false, 00:15:13.009 "zone_management": false, 00:15:13.009 "zone_append": false, 00:15:13.009 "compare": false, 00:15:13.009 "compare_and_write": false, 00:15:13.009 "abort": true, 00:15:13.009 "seek_hole": false, 00:15:13.009 "seek_data": false, 00:15:13.009 "copy": true, 00:15:13.009 "nvme_iov_md": false 00:15:13.009 }, 00:15:13.009 "memory_domains": [ 00:15:13.009 { 00:15:13.009 "dma_device_id": "system", 00:15:13.009 "dma_device_type": 1 00:15:13.009 }, 00:15:13.009 { 00:15:13.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.009 "dma_device_type": 2 00:15:13.009 } 00:15:13.009 ], 00:15:13.009 "driver_specific": {} 00:15:13.009 }' 00:15:13.009 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:13.009 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:13.009 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:13.009 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:13.009 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:13.009 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:13.009 
09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:13.009 09:19:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:13.267 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:13.267 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:13.267 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:13.267 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:13.267 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:13.835 [2024-07-15 09:19:22.632117] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:13.835 [2024-07-15 09:19:22.632145] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:13.835 [2024-07-15 09:19:22.632185] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.835 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.092 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.092 "name": "Existed_Raid", 00:15:14.092 "uuid": "9d1b8a66-bfd3-4508-b25a-b2dee2011991", 00:15:14.092 "strip_size_kb": 64, 00:15:14.092 "state": "offline", 00:15:14.092 "raid_level": 
"concat", 00:15:14.092 "superblock": true, 00:15:14.092 "num_base_bdevs": 3, 00:15:14.092 "num_base_bdevs_discovered": 2, 00:15:14.092 "num_base_bdevs_operational": 2, 00:15:14.092 "base_bdevs_list": [ 00:15:14.092 { 00:15:14.092 "name": null, 00:15:14.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.092 "is_configured": false, 00:15:14.092 "data_offset": 2048, 00:15:14.092 "data_size": 63488 00:15:14.092 }, 00:15:14.092 { 00:15:14.092 "name": "BaseBdev2", 00:15:14.092 "uuid": "8452022d-9972-4af6-9e98-3c54d687b973", 00:15:14.092 "is_configured": true, 00:15:14.092 "data_offset": 2048, 00:15:14.092 "data_size": 63488 00:15:14.092 }, 00:15:14.092 { 00:15:14.092 "name": "BaseBdev3", 00:15:14.092 "uuid": "0466c6ea-bc6f-472a-a866-e0c392df5ec9", 00:15:14.092 "is_configured": true, 00:15:14.092 "data_offset": 2048, 00:15:14.092 "data_size": 63488 00:15:14.092 } 00:15:14.092 ] 00:15:14.092 }' 00:15:14.092 09:19:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.092 09:19:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:14.659 09:19:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:14.659 09:19:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:14.659 09:19:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:14.659 09:19:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.918 09:19:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:14.918 09:19:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:14.918 09:19:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:15.177 [2024-07-15 09:19:23.973566] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:15.177 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:15.177 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:15.177 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.177 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:15.435 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:15.435 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:15.435 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:15.694 [2024-07-15 09:19:24.403061] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:15.694 [2024-07-15 09:19:24.403101] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b06400 name Existed_Raid, state offline 00:15:15.694 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:15:15.694 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:15.694 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.694 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:16.262 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:16.262 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:16.262 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:16.262 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:16.262 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:16.262 09:19:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:16.262 BaseBdev2 00:15:16.262 09:19:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:16.262 09:19:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:16.262 09:19:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:16.262 09:19:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:16.262 09:19:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:16.262 09:19:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:16.262 09:19:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:16.521 09:19:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:17.089 [ 00:15:17.089 { 00:15:17.089 "name": "BaseBdev2", 00:15:17.089 "aliases": [ 00:15:17.089 "d5ade617-dfd9-43f0-96a1-b60692cd9d35" 00:15:17.089 ], 00:15:17.089 "product_name": "Malloc disk", 00:15:17.089 "block_size": 512, 00:15:17.089 "num_blocks": 65536, 00:15:17.089 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:17.089 "assigned_rate_limits": { 00:15:17.089 "rw_ios_per_sec": 0, 00:15:17.089 "rw_mbytes_per_sec": 0, 00:15:17.089 "r_mbytes_per_sec": 0, 00:15:17.089 "w_mbytes_per_sec": 0 00:15:17.089 }, 00:15:17.089 "claimed": false, 00:15:17.089 "zoned": false, 00:15:17.089 "supported_io_types": { 00:15:17.089 "read": true, 00:15:17.089 "write": true, 00:15:17.089 "unmap": true, 00:15:17.089 "flush": true, 00:15:17.089 "reset": true, 00:15:17.089 "nvme_admin": false, 00:15:17.089 "nvme_io": false, 00:15:17.089 "nvme_io_md": false, 00:15:17.089 "write_zeroes": true, 00:15:17.089 "zcopy": true, 00:15:17.089 "get_zone_info": false, 00:15:17.089 "zone_management": false, 00:15:17.089 "zone_append": false, 00:15:17.089 "compare": false, 00:15:17.089 "compare_and_write": false, 00:15:17.089 "abort": true, 00:15:17.089 "seek_hole": false, 00:15:17.089 "seek_data": false, 00:15:17.089 "copy": 
true, 00:15:17.089 "nvme_iov_md": false 00:15:17.089 }, 00:15:17.089 "memory_domains": [ 00:15:17.089 { 00:15:17.089 "dma_device_id": "system", 00:15:17.089 "dma_device_type": 1 00:15:17.089 }, 00:15:17.089 { 00:15:17.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.089 "dma_device_type": 2 00:15:17.089 } 00:15:17.089 ], 00:15:17.090 "driver_specific": {} 00:15:17.090 } 00:15:17.090 ] 00:15:17.090 09:19:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:17.090 09:19:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:17.090 09:19:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:17.090 09:19:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:17.657 BaseBdev3 00:15:17.657 09:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:17.657 09:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:17.657 09:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:17.657 09:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:17.657 09:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:17.657 09:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:17.657 09:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:17.916 09:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:18.482 [ 00:15:18.482 { 00:15:18.482 "name": "BaseBdev3", 00:15:18.482 "aliases": [ 00:15:18.482 "861ca3f7-c8db-4578-be61-b10a153e2be4" 00:15:18.482 ], 00:15:18.482 "product_name": "Malloc disk", 00:15:18.482 "block_size": 512, 00:15:18.482 "num_blocks": 65536, 00:15:18.482 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:18.482 "assigned_rate_limits": { 00:15:18.482 "rw_ios_per_sec": 0, 00:15:18.482 "rw_mbytes_per_sec": 0, 00:15:18.482 "r_mbytes_per_sec": 0, 00:15:18.482 "w_mbytes_per_sec": 0 00:15:18.482 }, 00:15:18.482 "claimed": false, 00:15:18.482 "zoned": false, 00:15:18.482 "supported_io_types": { 00:15:18.482 "read": true, 00:15:18.482 "write": true, 00:15:18.482 "unmap": true, 00:15:18.482 "flush": true, 00:15:18.482 "reset": true, 00:15:18.482 "nvme_admin": false, 00:15:18.482 "nvme_io": false, 00:15:18.482 "nvme_io_md": false, 00:15:18.482 "write_zeroes": true, 00:15:18.482 "zcopy": true, 00:15:18.482 "get_zone_info": false, 00:15:18.482 "zone_management": false, 00:15:18.482 "zone_append": false, 00:15:18.482 "compare": false, 00:15:18.482 "compare_and_write": false, 00:15:18.482 "abort": true, 00:15:18.482 "seek_hole": false, 00:15:18.482 "seek_data": false, 00:15:18.482 "copy": true, 00:15:18.482 "nvme_iov_md": false 00:15:18.482 }, 00:15:18.482 "memory_domains": [ 00:15:18.482 { 00:15:18.482 "dma_device_id": "system", 00:15:18.482 "dma_device_type": 1 00:15:18.482 }, 00:15:18.482 { 00:15:18.482 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:18.482 "dma_device_type": 2 00:15:18.482 } 00:15:18.482 ], 00:15:18.482 "driver_specific": {} 00:15:18.482 } 00:15:18.482 ] 00:15:18.482 09:19:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:18.482 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:18.482 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:18.482 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:18.741 [2024-07-15 09:19:27.688795] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:18.741 [2024-07-15 09:19:27.688839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:18.741 [2024-07-15 09:19:27.688860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:18.741 [2024-07-15 09:19:27.690241] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.010 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.268 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.268 "name": "Existed_Raid", 00:15:19.268 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:19.268 "strip_size_kb": 64, 00:15:19.268 "state": "configuring", 00:15:19.268 "raid_level": "concat", 00:15:19.268 "superblock": true, 00:15:19.268 "num_base_bdevs": 3, 00:15:19.268 "num_base_bdevs_discovered": 2, 00:15:19.268 "num_base_bdevs_operational": 3, 00:15:19.268 "base_bdevs_list": [ 00:15:19.268 { 00:15:19.268 "name": "BaseBdev1", 00:15:19.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.268 "is_configured": false, 00:15:19.268 "data_offset": 0, 00:15:19.269 "data_size": 0 00:15:19.269 }, 00:15:19.269 { 00:15:19.269 "name": 
"BaseBdev2", 00:15:19.269 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:19.269 "is_configured": true, 00:15:19.269 "data_offset": 2048, 00:15:19.269 "data_size": 63488 00:15:19.269 }, 00:15:19.269 { 00:15:19.269 "name": "BaseBdev3", 00:15:19.269 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:19.269 "is_configured": true, 00:15:19.269 "data_offset": 2048, 00:15:19.269 "data_size": 63488 00:15:19.269 } 00:15:19.269 ] 00:15:19.269 }' 00:15:19.269 09:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.269 09:19:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:19.836 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:19.836 [2024-07-15 09:19:28.787680] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.095 09:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.663 09:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.663 "name": "Existed_Raid", 00:15:20.663 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:20.663 "strip_size_kb": 64, 00:15:20.663 "state": "configuring", 00:15:20.663 "raid_level": "concat", 00:15:20.663 "superblock": true, 00:15:20.663 "num_base_bdevs": 3, 00:15:20.663 "num_base_bdevs_discovered": 1, 00:15:20.663 "num_base_bdevs_operational": 3, 00:15:20.663 "base_bdevs_list": [ 00:15:20.663 { 00:15:20.663 "name": "BaseBdev1", 00:15:20.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:20.663 "is_configured": false, 00:15:20.663 "data_offset": 0, 00:15:20.663 "data_size": 0 00:15:20.663 }, 00:15:20.663 { 00:15:20.663 "name": null, 00:15:20.663 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:20.663 "is_configured": false, 00:15:20.663 "data_offset": 2048, 00:15:20.663 "data_size": 63488 00:15:20.663 }, 00:15:20.663 { 00:15:20.663 "name": "BaseBdev3", 00:15:20.663 "uuid": 
"861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:20.663 "is_configured": true, 00:15:20.663 "data_offset": 2048, 00:15:20.663 "data_size": 63488 00:15:20.663 } 00:15:20.663 ] 00:15:20.663 }' 00:15:20.663 09:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.663 09:19:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:20.922 09:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.922 09:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:21.181 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:21.181 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:21.462 [2024-07-15 09:19:30.283094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:21.462 BaseBdev1 00:15:21.462 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:21.462 09:19:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:21.462 09:19:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:21.462 09:19:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:21.462 09:19:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:21.462 09:19:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:21.462 09:19:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:21.744 09:19:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:22.002 [ 00:15:22.002 { 00:15:22.002 "name": "BaseBdev1", 00:15:22.002 "aliases": [ 00:15:22.002 "15bb528d-9549-4c70-8be6-d3d37ca51b49" 00:15:22.002 ], 00:15:22.002 "product_name": "Malloc disk", 00:15:22.002 "block_size": 512, 00:15:22.002 "num_blocks": 65536, 00:15:22.002 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:22.002 "assigned_rate_limits": { 00:15:22.002 "rw_ios_per_sec": 0, 00:15:22.002 "rw_mbytes_per_sec": 0, 00:15:22.002 "r_mbytes_per_sec": 0, 00:15:22.002 "w_mbytes_per_sec": 0 00:15:22.002 }, 00:15:22.002 "claimed": true, 00:15:22.002 "claim_type": "exclusive_write", 00:15:22.002 "zoned": false, 00:15:22.002 "supported_io_types": { 00:15:22.002 "read": true, 00:15:22.002 "write": true, 00:15:22.002 "unmap": true, 00:15:22.002 "flush": true, 00:15:22.002 "reset": true, 00:15:22.002 "nvme_admin": false, 00:15:22.002 "nvme_io": false, 00:15:22.002 "nvme_io_md": false, 00:15:22.002 "write_zeroes": true, 00:15:22.002 "zcopy": true, 00:15:22.002 "get_zone_info": false, 00:15:22.002 "zone_management": false, 00:15:22.002 "zone_append": false, 00:15:22.002 "compare": false, 00:15:22.002 "compare_and_write": false, 00:15:22.002 "abort": true, 00:15:22.002 "seek_hole": false, 
00:15:22.002 "seek_data": false, 00:15:22.002 "copy": true, 00:15:22.002 "nvme_iov_md": false 00:15:22.002 }, 00:15:22.002 "memory_domains": [ 00:15:22.002 { 00:15:22.002 "dma_device_id": "system", 00:15:22.002 "dma_device_type": 1 00:15:22.002 }, 00:15:22.002 { 00:15:22.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.002 "dma_device_type": 2 00:15:22.002 } 00:15:22.002 ], 00:15:22.002 "driver_specific": {} 00:15:22.002 } 00:15:22.002 ] 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.002 09:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.568 09:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.568 "name": "Existed_Raid", 00:15:22.568 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:22.568 "strip_size_kb": 64, 00:15:22.568 "state": "configuring", 00:15:22.569 "raid_level": "concat", 00:15:22.569 "superblock": true, 00:15:22.569 "num_base_bdevs": 3, 00:15:22.569 "num_base_bdevs_discovered": 2, 00:15:22.569 "num_base_bdevs_operational": 3, 00:15:22.569 "base_bdevs_list": [ 00:15:22.569 { 00:15:22.569 "name": "BaseBdev1", 00:15:22.569 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:22.569 "is_configured": true, 00:15:22.569 "data_offset": 2048, 00:15:22.569 "data_size": 63488 00:15:22.569 }, 00:15:22.569 { 00:15:22.569 "name": null, 00:15:22.569 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:22.569 "is_configured": false, 00:15:22.569 "data_offset": 2048, 00:15:22.569 "data_size": 63488 00:15:22.569 }, 00:15:22.569 { 00:15:22.569 "name": "BaseBdev3", 00:15:22.569 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:22.569 "is_configured": true, 00:15:22.569 "data_offset": 2048, 00:15:22.569 "data_size": 63488 00:15:22.569 } 00:15:22.569 ] 00:15:22.569 }' 00:15:22.569 09:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.569 09:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:23.135 09:19:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:23.135 09:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:23.393 [2024-07-15 09:19:32.324537] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.393 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.651 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.651 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.217 09:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.217 "name": "Existed_Raid", 00:15:24.217 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:24.217 "strip_size_kb": 64, 00:15:24.217 "state": "configuring", 00:15:24.217 "raid_level": "concat", 00:15:24.217 "superblock": true, 00:15:24.217 "num_base_bdevs": 3, 00:15:24.217 "num_base_bdevs_discovered": 1, 00:15:24.217 "num_base_bdevs_operational": 3, 00:15:24.217 "base_bdevs_list": [ 00:15:24.217 { 00:15:24.217 "name": "BaseBdev1", 00:15:24.217 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:24.217 "is_configured": true, 00:15:24.217 "data_offset": 2048, 00:15:24.217 "data_size": 63488 00:15:24.217 }, 00:15:24.217 { 00:15:24.217 "name": null, 00:15:24.217 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:24.217 "is_configured": false, 00:15:24.217 "data_offset": 2048, 00:15:24.217 "data_size": 63488 00:15:24.217 }, 00:15:24.217 { 00:15:24.217 "name": null, 00:15:24.217 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:24.217 "is_configured": false, 00:15:24.217 "data_offset": 2048, 00:15:24.217 "data_size": 63488 00:15:24.217 } 00:15:24.217 ] 00:15:24.217 }' 00:15:24.217 09:19:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.217 09:19:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:24.781 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.781 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:24.781 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:24.781 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:25.040 [2024-07-15 09:19:33.928808] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.040 09:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.298 09:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.298 "name": "Existed_Raid", 00:15:25.298 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:25.298 "strip_size_kb": 64, 00:15:25.298 "state": "configuring", 00:15:25.298 "raid_level": "concat", 00:15:25.298 "superblock": true, 00:15:25.298 "num_base_bdevs": 3, 00:15:25.298 "num_base_bdevs_discovered": 2, 00:15:25.298 "num_base_bdevs_operational": 3, 00:15:25.298 "base_bdevs_list": [ 00:15:25.298 { 00:15:25.298 "name": "BaseBdev1", 00:15:25.298 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:25.298 "is_configured": true, 00:15:25.298 "data_offset": 2048, 00:15:25.298 "data_size": 63488 00:15:25.298 }, 00:15:25.298 { 00:15:25.298 "name": null, 00:15:25.298 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:25.298 "is_configured": false, 00:15:25.298 "data_offset": 2048, 00:15:25.298 "data_size": 63488 00:15:25.298 }, 00:15:25.298 { 00:15:25.298 "name": "BaseBdev3", 00:15:25.298 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 
00:15:25.298 "is_configured": true, 00:15:25.298 "data_offset": 2048, 00:15:25.298 "data_size": 63488 00:15:25.298 } 00:15:25.298 ] 00:15:25.298 }' 00:15:25.298 09:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.298 09:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:26.261 09:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.261 09:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:26.261 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:26.261 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:26.519 [2024-07-15 09:19:35.432811] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.519 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:26.776 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:26.776 "name": "Existed_Raid", 00:15:26.776 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:26.776 "strip_size_kb": 64, 00:15:26.776 "state": "configuring", 00:15:26.776 "raid_level": "concat", 00:15:26.776 "superblock": true, 00:15:26.776 "num_base_bdevs": 3, 00:15:26.776 "num_base_bdevs_discovered": 1, 00:15:26.776 "num_base_bdevs_operational": 3, 00:15:26.776 "base_bdevs_list": [ 00:15:26.776 { 00:15:26.776 "name": null, 00:15:26.776 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:26.776 "is_configured": false, 00:15:26.776 "data_offset": 2048, 00:15:26.776 "data_size": 63488 00:15:26.776 }, 00:15:26.776 { 00:15:26.776 "name": null, 00:15:26.776 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:26.776 "is_configured": false, 00:15:26.776 "data_offset": 2048, 
00:15:26.776 "data_size": 63488 00:15:26.776 }, 00:15:26.776 { 00:15:26.776 "name": "BaseBdev3", 00:15:26.776 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:26.776 "is_configured": true, 00:15:26.776 "data_offset": 2048, 00:15:26.776 "data_size": 63488 00:15:26.776 } 00:15:26.776 ] 00:15:26.776 }' 00:15:26.776 09:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:26.776 09:19:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:27.341 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:27.341 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.598 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:27.598 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:27.855 [2024-07-15 09:19:36.696525] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.855 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.113 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.113 "name": "Existed_Raid", 00:15:28.113 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:28.113 "strip_size_kb": 64, 00:15:28.113 "state": "configuring", 00:15:28.113 "raid_level": "concat", 00:15:28.113 "superblock": true, 00:15:28.113 "num_base_bdevs": 3, 00:15:28.113 "num_base_bdevs_discovered": 2, 00:15:28.113 "num_base_bdevs_operational": 3, 00:15:28.113 "base_bdevs_list": [ 00:15:28.113 { 00:15:28.113 "name": null, 00:15:28.113 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:28.113 "is_configured": false, 00:15:28.113 "data_offset": 2048, 00:15:28.113 
"data_size": 63488 00:15:28.113 }, 00:15:28.113 { 00:15:28.113 "name": "BaseBdev2", 00:15:28.113 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:28.113 "is_configured": true, 00:15:28.113 "data_offset": 2048, 00:15:28.113 "data_size": 63488 00:15:28.113 }, 00:15:28.113 { 00:15:28.113 "name": "BaseBdev3", 00:15:28.113 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:28.113 "is_configured": true, 00:15:28.113 "data_offset": 2048, 00:15:28.113 "data_size": 63488 00:15:28.113 } 00:15:28.113 ] 00:15:28.113 }' 00:15:28.113 09:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.113 09:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:28.679 09:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.679 09:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:28.936 09:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:28.936 09:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.936 09:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:29.193 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 15bb528d-9549-4c70-8be6-d3d37ca51b49 00:15:29.451 [2024-07-15 09:19:38.280164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:29.451 [2024-07-15 09:19:38.280317] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b04f50 00:15:29.451 [2024-07-15 09:19:38.280330] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:29.451 [2024-07-15 09:19:38.280502] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x180b940 00:15:29.451 [2024-07-15 09:19:38.280621] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b04f50 00:15:29.451 [2024-07-15 09:19:38.280631] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b04f50 00:15:29.451 [2024-07-15 09:19:38.280723] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:29.451 NewBaseBdev 00:15:29.451 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:29.451 09:19:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:29.451 09:19:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:29.451 09:19:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:29.451 09:19:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:29.451 09:19:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:29.451 09:19:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:29.711 09:19:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:29.969 [ 00:15:29.969 { 00:15:29.969 "name": "NewBaseBdev", 00:15:29.969 "aliases": [ 00:15:29.969 "15bb528d-9549-4c70-8be6-d3d37ca51b49" 00:15:29.969 ], 00:15:29.969 "product_name": "Malloc disk", 00:15:29.969 "block_size": 512, 00:15:29.969 "num_blocks": 65536, 00:15:29.969 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:29.969 "assigned_rate_limits": { 00:15:29.969 "rw_ios_per_sec": 0, 00:15:29.969 "rw_mbytes_per_sec": 0, 00:15:29.969 "r_mbytes_per_sec": 0, 00:15:29.969 "w_mbytes_per_sec": 0 00:15:29.969 }, 00:15:29.969 "claimed": true, 00:15:29.969 "claim_type": "exclusive_write", 00:15:29.969 "zoned": false, 00:15:29.969 "supported_io_types": { 00:15:29.969 "read": true, 00:15:29.969 "write": true, 00:15:29.969 "unmap": true, 00:15:29.970 "flush": true, 00:15:29.970 "reset": true, 00:15:29.970 "nvme_admin": false, 00:15:29.970 "nvme_io": false, 00:15:29.970 "nvme_io_md": false, 00:15:29.970 "write_zeroes": true, 00:15:29.970 "zcopy": true, 00:15:29.970 "get_zone_info": false, 00:15:29.970 "zone_management": false, 00:15:29.970 "zone_append": false, 00:15:29.970 "compare": false, 00:15:29.970 "compare_and_write": false, 00:15:29.970 "abort": true, 00:15:29.970 "seek_hole": false, 00:15:29.970 "seek_data": false, 00:15:29.970 "copy": true, 00:15:29.970 "nvme_iov_md": false 00:15:29.970 }, 00:15:29.970 "memory_domains": [ 00:15:29.970 { 00:15:29.970 "dma_device_id": "system", 00:15:29.970 "dma_device_type": 1 00:15:29.970 }, 00:15:29.970 { 00:15:29.970 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:29.970 "dma_device_type": 2 00:15:29.970 } 00:15:29.970 ], 00:15:29.970 "driver_specific": {} 00:15:29.970 } 00:15:29.970 ] 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.970 09:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:30.536 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.536 "name": "Existed_Raid", 00:15:30.536 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:30.536 "strip_size_kb": 64, 00:15:30.536 "state": "online", 00:15:30.536 "raid_level": "concat", 00:15:30.536 "superblock": true, 00:15:30.536 "num_base_bdevs": 3, 00:15:30.536 "num_base_bdevs_discovered": 3, 00:15:30.536 "num_base_bdevs_operational": 3, 00:15:30.536 "base_bdevs_list": [ 00:15:30.536 { 00:15:30.536 "name": "NewBaseBdev", 00:15:30.536 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:30.536 "is_configured": true, 00:15:30.536 "data_offset": 2048, 00:15:30.536 "data_size": 63488 00:15:30.536 }, 00:15:30.536 { 00:15:30.536 "name": "BaseBdev2", 00:15:30.536 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:30.536 "is_configured": true, 00:15:30.536 "data_offset": 2048, 00:15:30.536 "data_size": 63488 00:15:30.536 }, 00:15:30.536 { 00:15:30.536 "name": "BaseBdev3", 00:15:30.536 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:30.536 "is_configured": true, 00:15:30.536 "data_offset": 2048, 00:15:30.536 "data_size": 63488 00:15:30.536 } 00:15:30.536 ] 00:15:30.536 }' 00:15:30.536 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.536 09:19:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:31.103 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:31.103 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:31.103 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:31.103 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:31.103 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:31.103 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:31.103 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:31.103 09:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:31.670 [2024-07-15 09:19:40.337975] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:31.670 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:31.670 "name": "Existed_Raid", 00:15:31.670 "aliases": [ 00:15:31.670 "e084ea9e-73f8-4f38-b90c-f9f939a00d6f" 00:15:31.670 ], 00:15:31.670 "product_name": "Raid Volume", 00:15:31.670 "block_size": 512, 00:15:31.670 "num_blocks": 190464, 00:15:31.670 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:31.670 "assigned_rate_limits": { 00:15:31.670 "rw_ios_per_sec": 0, 00:15:31.670 "rw_mbytes_per_sec": 0, 00:15:31.670 "r_mbytes_per_sec": 0, 00:15:31.670 "w_mbytes_per_sec": 0 00:15:31.670 }, 00:15:31.670 "claimed": false, 00:15:31.670 "zoned": false, 00:15:31.670 "supported_io_types": { 00:15:31.670 "read": true, 00:15:31.670 "write": true, 00:15:31.670 "unmap": true, 00:15:31.670 "flush": true, 00:15:31.670 "reset": true, 00:15:31.670 "nvme_admin": false, 00:15:31.670 "nvme_io": false, 00:15:31.670 "nvme_io_md": false, 00:15:31.670 "write_zeroes": true, 00:15:31.670 
"zcopy": false, 00:15:31.670 "get_zone_info": false, 00:15:31.670 "zone_management": false, 00:15:31.670 "zone_append": false, 00:15:31.670 "compare": false, 00:15:31.670 "compare_and_write": false, 00:15:31.670 "abort": false, 00:15:31.670 "seek_hole": false, 00:15:31.670 "seek_data": false, 00:15:31.670 "copy": false, 00:15:31.670 "nvme_iov_md": false 00:15:31.670 }, 00:15:31.670 "memory_domains": [ 00:15:31.670 { 00:15:31.670 "dma_device_id": "system", 00:15:31.670 "dma_device_type": 1 00:15:31.670 }, 00:15:31.670 { 00:15:31.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.670 "dma_device_type": 2 00:15:31.670 }, 00:15:31.670 { 00:15:31.670 "dma_device_id": "system", 00:15:31.670 "dma_device_type": 1 00:15:31.670 }, 00:15:31.670 { 00:15:31.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.670 "dma_device_type": 2 00:15:31.670 }, 00:15:31.670 { 00:15:31.670 "dma_device_id": "system", 00:15:31.670 "dma_device_type": 1 00:15:31.670 }, 00:15:31.670 { 00:15:31.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.670 "dma_device_type": 2 00:15:31.670 } 00:15:31.670 ], 00:15:31.670 "driver_specific": { 00:15:31.670 "raid": { 00:15:31.670 "uuid": "e084ea9e-73f8-4f38-b90c-f9f939a00d6f", 00:15:31.670 "strip_size_kb": 64, 00:15:31.670 "state": "online", 00:15:31.670 "raid_level": "concat", 00:15:31.670 "superblock": true, 00:15:31.670 "num_base_bdevs": 3, 00:15:31.670 "num_base_bdevs_discovered": 3, 00:15:31.670 "num_base_bdevs_operational": 3, 00:15:31.670 "base_bdevs_list": [ 00:15:31.670 { 00:15:31.670 "name": "NewBaseBdev", 00:15:31.670 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:31.670 "is_configured": true, 00:15:31.670 "data_offset": 2048, 00:15:31.670 "data_size": 63488 00:15:31.670 }, 00:15:31.670 { 00:15:31.670 "name": "BaseBdev2", 00:15:31.670 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:31.670 "is_configured": true, 00:15:31.670 "data_offset": 2048, 00:15:31.670 "data_size": 63488 00:15:31.670 }, 00:15:31.670 { 00:15:31.670 "name": "BaseBdev3", 00:15:31.670 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:31.670 "is_configured": true, 00:15:31.670 "data_offset": 2048, 00:15:31.670 "data_size": 63488 00:15:31.670 } 00:15:31.670 ] 00:15:31.670 } 00:15:31.670 } 00:15:31.670 }' 00:15:31.670 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:31.670 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:31.670 BaseBdev2 00:15:31.670 BaseBdev3' 00:15:31.670 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.670 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:31.670 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:31.928 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:31.928 "name": "NewBaseBdev", 00:15:31.928 "aliases": [ 00:15:31.928 "15bb528d-9549-4c70-8be6-d3d37ca51b49" 00:15:31.928 ], 00:15:31.928 "product_name": "Malloc disk", 00:15:31.928 "block_size": 512, 00:15:31.928 "num_blocks": 65536, 00:15:31.928 "uuid": "15bb528d-9549-4c70-8be6-d3d37ca51b49", 00:15:31.928 "assigned_rate_limits": { 00:15:31.928 "rw_ios_per_sec": 0, 00:15:31.928 "rw_mbytes_per_sec": 0, 
00:15:31.928 "r_mbytes_per_sec": 0, 00:15:31.928 "w_mbytes_per_sec": 0 00:15:31.928 }, 00:15:31.928 "claimed": true, 00:15:31.928 "claim_type": "exclusive_write", 00:15:31.928 "zoned": false, 00:15:31.928 "supported_io_types": { 00:15:31.928 "read": true, 00:15:31.928 "write": true, 00:15:31.928 "unmap": true, 00:15:31.928 "flush": true, 00:15:31.928 "reset": true, 00:15:31.928 "nvme_admin": false, 00:15:31.928 "nvme_io": false, 00:15:31.928 "nvme_io_md": false, 00:15:31.928 "write_zeroes": true, 00:15:31.928 "zcopy": true, 00:15:31.928 "get_zone_info": false, 00:15:31.928 "zone_management": false, 00:15:31.928 "zone_append": false, 00:15:31.928 "compare": false, 00:15:31.928 "compare_and_write": false, 00:15:31.928 "abort": true, 00:15:31.928 "seek_hole": false, 00:15:31.928 "seek_data": false, 00:15:31.928 "copy": true, 00:15:31.928 "nvme_iov_md": false 00:15:31.928 }, 00:15:31.928 "memory_domains": [ 00:15:31.928 { 00:15:31.928 "dma_device_id": "system", 00:15:31.928 "dma_device_type": 1 00:15:31.928 }, 00:15:31.928 { 00:15:31.928 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.928 "dma_device_type": 2 00:15:31.928 } 00:15:31.928 ], 00:15:31.928 "driver_specific": {} 00:15:31.928 }' 00:15:31.928 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.928 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.928 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:31.928 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.928 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.186 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.186 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.186 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.186 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.186 09:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.186 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.186 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.186 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.186 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:32.186 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.443 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.443 "name": "BaseBdev2", 00:15:32.443 "aliases": [ 00:15:32.443 "d5ade617-dfd9-43f0-96a1-b60692cd9d35" 00:15:32.443 ], 00:15:32.443 "product_name": "Malloc disk", 00:15:32.443 "block_size": 512, 00:15:32.443 "num_blocks": 65536, 00:15:32.443 "uuid": "d5ade617-dfd9-43f0-96a1-b60692cd9d35", 00:15:32.443 "assigned_rate_limits": { 00:15:32.443 "rw_ios_per_sec": 0, 00:15:32.443 "rw_mbytes_per_sec": 0, 00:15:32.443 "r_mbytes_per_sec": 0, 00:15:32.443 "w_mbytes_per_sec": 0 00:15:32.443 }, 00:15:32.443 "claimed": true, 00:15:32.443 
"claim_type": "exclusive_write", 00:15:32.443 "zoned": false, 00:15:32.443 "supported_io_types": { 00:15:32.443 "read": true, 00:15:32.443 "write": true, 00:15:32.443 "unmap": true, 00:15:32.443 "flush": true, 00:15:32.443 "reset": true, 00:15:32.443 "nvme_admin": false, 00:15:32.443 "nvme_io": false, 00:15:32.443 "nvme_io_md": false, 00:15:32.443 "write_zeroes": true, 00:15:32.443 "zcopy": true, 00:15:32.443 "get_zone_info": false, 00:15:32.443 "zone_management": false, 00:15:32.443 "zone_append": false, 00:15:32.443 "compare": false, 00:15:32.443 "compare_and_write": false, 00:15:32.443 "abort": true, 00:15:32.443 "seek_hole": false, 00:15:32.443 "seek_data": false, 00:15:32.443 "copy": true, 00:15:32.443 "nvme_iov_md": false 00:15:32.443 }, 00:15:32.443 "memory_domains": [ 00:15:32.443 { 00:15:32.443 "dma_device_id": "system", 00:15:32.443 "dma_device_type": 1 00:15:32.443 }, 00:15:32.443 { 00:15:32.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.443 "dma_device_type": 2 00:15:32.443 } 00:15:32.443 ], 00:15:32.443 "driver_specific": {} 00:15:32.443 }' 00:15:32.443 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.443 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.699 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.699 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.699 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.699 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.699 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.699 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.699 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.699 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.956 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.956 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.956 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.956 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:32.956 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:33.214 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.214 "name": "BaseBdev3", 00:15:33.214 "aliases": [ 00:15:33.214 "861ca3f7-c8db-4578-be61-b10a153e2be4" 00:15:33.214 ], 00:15:33.214 "product_name": "Malloc disk", 00:15:33.214 "block_size": 512, 00:15:33.214 "num_blocks": 65536, 00:15:33.214 "uuid": "861ca3f7-c8db-4578-be61-b10a153e2be4", 00:15:33.214 "assigned_rate_limits": { 00:15:33.214 "rw_ios_per_sec": 0, 00:15:33.214 "rw_mbytes_per_sec": 0, 00:15:33.214 "r_mbytes_per_sec": 0, 00:15:33.214 "w_mbytes_per_sec": 0 00:15:33.214 }, 00:15:33.214 "claimed": true, 00:15:33.214 "claim_type": "exclusive_write", 00:15:33.214 "zoned": false, 00:15:33.214 "supported_io_types": { 00:15:33.214 "read": true, 
00:15:33.214 "write": true, 00:15:33.214 "unmap": true, 00:15:33.214 "flush": true, 00:15:33.214 "reset": true, 00:15:33.214 "nvme_admin": false, 00:15:33.214 "nvme_io": false, 00:15:33.214 "nvme_io_md": false, 00:15:33.214 "write_zeroes": true, 00:15:33.214 "zcopy": true, 00:15:33.214 "get_zone_info": false, 00:15:33.214 "zone_management": false, 00:15:33.214 "zone_append": false, 00:15:33.214 "compare": false, 00:15:33.214 "compare_and_write": false, 00:15:33.214 "abort": true, 00:15:33.214 "seek_hole": false, 00:15:33.214 "seek_data": false, 00:15:33.214 "copy": true, 00:15:33.214 "nvme_iov_md": false 00:15:33.214 }, 00:15:33.214 "memory_domains": [ 00:15:33.214 { 00:15:33.214 "dma_device_id": "system", 00:15:33.214 "dma_device_type": 1 00:15:33.214 }, 00:15:33.214 { 00:15:33.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.214 "dma_device_type": 2 00:15:33.214 } 00:15:33.214 ], 00:15:33.214 "driver_specific": {} 00:15:33.214 }' 00:15:33.214 09:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.214 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.214 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:33.214 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.214 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.473 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:33.473 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.473 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.473 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.473 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.473 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.473 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.473 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:33.730 [2024-07-15 09:19:42.587619] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:33.730 [2024-07-15 09:19:42.587646] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:33.730 [2024-07-15 09:19:42.587695] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:33.730 [2024-07-15 09:19:42.587746] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:33.730 [2024-07-15 09:19:42.587758] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b04f50 name Existed_Raid, state offline 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 122134 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 122134 ']' 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 122134 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 
00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 122134 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 122134' 00:15:33.730 killing process with pid 122134 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 122134 00:15:33.730 [2024-07-15 09:19:42.652146] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:33.730 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 122134 00:15:33.730 [2024-07-15 09:19:42.678295] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:33.988 09:19:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:33.988 00:15:33.988 real 0m31.553s 00:15:33.988 user 0m58.012s 00:15:33.988 sys 0m5.382s 00:15:33.988 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:33.988 09:19:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:33.988 ************************************ 00:15:33.988 END TEST raid_state_function_test_sb 00:15:33.988 ************************************ 00:15:33.988 09:19:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:33.988 09:19:42 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:15:33.988 09:19:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:33.988 09:19:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:33.988 09:19:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:34.247 ************************************ 00:15:34.247 START TEST raid_superblock_test 00:15:34.247 ************************************ 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 
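The raid_state_function_test_sb case that finishes just above drives the configuring/online state machine entirely through RPCs: remove a base bdev, recreate its backing malloc bdev, add it back, and finally delete the array. A condensed, illustrative replay of that sequence under the same bdev names the log records (repetition trimmed):

#!/usr/bin/env bash
# Abbreviated replay of the RPC calls recorded in the test above; every
# command and name appears in the log, only the looping is trimmed.
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

rpc bdev_raid_remove_base_bdev BaseBdev2            # array left in "configuring"
rpc bdev_malloc_create 32 512 -b BaseBdev1          # 32 MB of 512-byte blocks (65536 blocks)
rpc bdev_wait_for_examine
rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev2  # re-attach the removed slot
rpc bdev_raid_delete Existed_Raid                   # teardown once the array is online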
00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=126777 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 126777 /var/tmp/spdk-raid.sock 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 126777 ']' 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:34.247 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.247 09:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:34.247 [2024-07-15 09:19:43.016458] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
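raid_superblock_test, whose startup is logged here, builds its array from malloc bdevs wrapped in passthru bdevs and then creates a concat volume with an on-disk superblock; the individual RPC calls are all recorded further down in this log. A condensed sketch of that setup sequence, using the same strip size, level and names as the run below (illustrative only):

#!/usr/bin/env bash
# Setup sequence mirrored from the RPC calls recorded below: three 32 MB
# malloc bdevs, each wrapped in a passthru bdev with a fixed UUID, then a
# concat raid with a 64 KB strip size and a superblock (-s).
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

for i in 1 2 3; do
    rpc bdev_malloc_create 32 512 -b "malloc$i"
    rpc bdev_passthru_create -b "malloc$i" -p "pt$i" \
        -u "00000000-0000-0000-0000-00000000000$i"
done

rpc bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s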
00:15:34.247 [2024-07-15 09:19:43.016525] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid126777 ] 00:15:34.247 [2024-07-15 09:19:43.144198] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:34.505 [2024-07-15 09:19:43.250043] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:34.505 [2024-07-15 09:19:43.318462] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:34.505 [2024-07-15 09:19:43.318500] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:35.070 09:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:35.635 malloc1 00:15:35.635 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:35.893 [2024-07-15 09:19:44.674106] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:35.893 [2024-07-15 09:19:44.674154] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:35.893 [2024-07-15 09:19:44.674174] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b19570 00:15:35.893 [2024-07-15 09:19:44.674187] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:35.893 [2024-07-15 09:19:44.675916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:35.893 [2024-07-15 09:19:44.675953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:35.893 pt1 00:15:35.893 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:35.893 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:35.893 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:35.893 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:35.893 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:35.893 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:35.893 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:35.893 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:35.893 09:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:36.457 malloc2 00:15:36.457 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:36.715 [2024-07-15 09:19:45.430113] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:36.715 [2024-07-15 09:19:45.430157] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.715 [2024-07-15 09:19:45.430173] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b1a970 00:15:36.715 [2024-07-15 09:19:45.430186] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.715 [2024-07-15 09:19:45.431809] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:36.715 [2024-07-15 09:19:45.431837] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:36.715 pt2 00:15:36.715 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:36.715 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:36.715 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:36.715 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:36.715 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:36.715 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:36.715 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:36.715 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:36.715 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:36.974 malloc3 00:15:36.974 09:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:37.232 [2024-07-15 09:19:46.168646] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:37.232 [2024-07-15 09:19:46.168693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.232 [2024-07-15 09:19:46.168711] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cb1340 00:15:37.232 [2024-07-15 09:19:46.168724] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.232 [2024-07-15 09:19:46.170313] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.232 [2024-07-15 09:19:46.170342] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:37.232 pt3 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:37.490 [2024-07-15 09:19:46.417325] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:37.490 [2024-07-15 09:19:46.418679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:37.490 [2024-07-15 09:19:46.418733] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:37.490 [2024-07-15 09:19:46.418887] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b11ea0 00:15:37.490 [2024-07-15 09:19:46.418898] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:37.490 [2024-07-15 09:19:46.419107] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b19240 00:15:37.490 [2024-07-15 09:19:46.419254] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b11ea0 00:15:37.490 [2024-07-15 09:19:46.419264] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b11ea0 00:15:37.490 [2024-07-15 09:19:46.419362] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.490 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:37.748 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.748 "name": "raid_bdev1", 00:15:37.748 "uuid": "8a0bb2d8-1eca-4d9c-9b88-15763415d83f", 00:15:37.748 "strip_size_kb": 64, 00:15:37.748 "state": "online", 00:15:37.748 "raid_level": "concat", 00:15:37.748 "superblock": true, 00:15:37.748 "num_base_bdevs": 3, 
00:15:37.748 "num_base_bdevs_discovered": 3, 00:15:37.748 "num_base_bdevs_operational": 3, 00:15:37.748 "base_bdevs_list": [ 00:15:37.748 { 00:15:37.748 "name": "pt1", 00:15:37.748 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:37.748 "is_configured": true, 00:15:37.748 "data_offset": 2048, 00:15:37.748 "data_size": 63488 00:15:37.748 }, 00:15:37.748 { 00:15:37.748 "name": "pt2", 00:15:37.748 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:37.748 "is_configured": true, 00:15:37.748 "data_offset": 2048, 00:15:37.748 "data_size": 63488 00:15:37.748 }, 00:15:37.748 { 00:15:37.748 "name": "pt3", 00:15:37.748 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:37.748 "is_configured": true, 00:15:37.748 "data_offset": 2048, 00:15:37.748 "data_size": 63488 00:15:37.748 } 00:15:37.748 ] 00:15:37.748 }' 00:15:37.748 09:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.748 09:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.712 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:38.712 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:38.712 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:38.712 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:38.712 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:38.712 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:38.712 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:38.712 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:38.712 [2024-07-15 09:19:47.508502] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.712 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:38.712 "name": "raid_bdev1", 00:15:38.712 "aliases": [ 00:15:38.712 "8a0bb2d8-1eca-4d9c-9b88-15763415d83f" 00:15:38.712 ], 00:15:38.712 "product_name": "Raid Volume", 00:15:38.712 "block_size": 512, 00:15:38.712 "num_blocks": 190464, 00:15:38.712 "uuid": "8a0bb2d8-1eca-4d9c-9b88-15763415d83f", 00:15:38.712 "assigned_rate_limits": { 00:15:38.712 "rw_ios_per_sec": 0, 00:15:38.712 "rw_mbytes_per_sec": 0, 00:15:38.712 "r_mbytes_per_sec": 0, 00:15:38.712 "w_mbytes_per_sec": 0 00:15:38.712 }, 00:15:38.712 "claimed": false, 00:15:38.712 "zoned": false, 00:15:38.712 "supported_io_types": { 00:15:38.712 "read": true, 00:15:38.712 "write": true, 00:15:38.712 "unmap": true, 00:15:38.712 "flush": true, 00:15:38.712 "reset": true, 00:15:38.712 "nvme_admin": false, 00:15:38.712 "nvme_io": false, 00:15:38.712 "nvme_io_md": false, 00:15:38.712 "write_zeroes": true, 00:15:38.712 "zcopy": false, 00:15:38.712 "get_zone_info": false, 00:15:38.712 "zone_management": false, 00:15:38.712 "zone_append": false, 00:15:38.712 "compare": false, 00:15:38.712 "compare_and_write": false, 00:15:38.712 "abort": false, 00:15:38.712 "seek_hole": false, 00:15:38.712 "seek_data": false, 00:15:38.712 "copy": false, 00:15:38.712 "nvme_iov_md": false 00:15:38.712 }, 00:15:38.712 "memory_domains": [ 00:15:38.712 { 00:15:38.712 "dma_device_id": "system", 00:15:38.712 "dma_device_type": 1 
00:15:38.712 }, 00:15:38.712 { 00:15:38.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.712 "dma_device_type": 2 00:15:38.712 }, 00:15:38.712 { 00:15:38.713 "dma_device_id": "system", 00:15:38.713 "dma_device_type": 1 00:15:38.713 }, 00:15:38.713 { 00:15:38.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.713 "dma_device_type": 2 00:15:38.713 }, 00:15:38.713 { 00:15:38.713 "dma_device_id": "system", 00:15:38.713 "dma_device_type": 1 00:15:38.713 }, 00:15:38.713 { 00:15:38.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.713 "dma_device_type": 2 00:15:38.713 } 00:15:38.713 ], 00:15:38.713 "driver_specific": { 00:15:38.713 "raid": { 00:15:38.713 "uuid": "8a0bb2d8-1eca-4d9c-9b88-15763415d83f", 00:15:38.713 "strip_size_kb": 64, 00:15:38.713 "state": "online", 00:15:38.713 "raid_level": "concat", 00:15:38.713 "superblock": true, 00:15:38.713 "num_base_bdevs": 3, 00:15:38.713 "num_base_bdevs_discovered": 3, 00:15:38.713 "num_base_bdevs_operational": 3, 00:15:38.713 "base_bdevs_list": [ 00:15:38.713 { 00:15:38.713 "name": "pt1", 00:15:38.713 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.713 "is_configured": true, 00:15:38.713 "data_offset": 2048, 00:15:38.713 "data_size": 63488 00:15:38.713 }, 00:15:38.713 { 00:15:38.713 "name": "pt2", 00:15:38.713 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.713 "is_configured": true, 00:15:38.713 "data_offset": 2048, 00:15:38.713 "data_size": 63488 00:15:38.713 }, 00:15:38.713 { 00:15:38.713 "name": "pt3", 00:15:38.713 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:38.713 "is_configured": true, 00:15:38.713 "data_offset": 2048, 00:15:38.713 "data_size": 63488 00:15:38.713 } 00:15:38.713 ] 00:15:38.713 } 00:15:38.713 } 00:15:38.713 }' 00:15:38.713 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.713 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:38.713 pt2 00:15:38.713 pt3' 00:15:38.713 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.713 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:38.713 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.971 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.971 "name": "pt1", 00:15:38.971 "aliases": [ 00:15:38.971 "00000000-0000-0000-0000-000000000001" 00:15:38.971 ], 00:15:38.971 "product_name": "passthru", 00:15:38.971 "block_size": 512, 00:15:38.971 "num_blocks": 65536, 00:15:38.971 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.971 "assigned_rate_limits": { 00:15:38.971 "rw_ios_per_sec": 0, 00:15:38.971 "rw_mbytes_per_sec": 0, 00:15:38.971 "r_mbytes_per_sec": 0, 00:15:38.971 "w_mbytes_per_sec": 0 00:15:38.971 }, 00:15:38.971 "claimed": true, 00:15:38.971 "claim_type": "exclusive_write", 00:15:38.971 "zoned": false, 00:15:38.971 "supported_io_types": { 00:15:38.971 "read": true, 00:15:38.971 "write": true, 00:15:38.971 "unmap": true, 00:15:38.971 "flush": true, 00:15:38.971 "reset": true, 00:15:38.971 "nvme_admin": false, 00:15:38.971 "nvme_io": false, 00:15:38.971 "nvme_io_md": false, 00:15:38.971 "write_zeroes": true, 00:15:38.971 "zcopy": true, 00:15:38.971 "get_zone_info": false, 00:15:38.971 "zone_management": false, 
00:15:38.971 "zone_append": false, 00:15:38.971 "compare": false, 00:15:38.971 "compare_and_write": false, 00:15:38.971 "abort": true, 00:15:38.971 "seek_hole": false, 00:15:38.971 "seek_data": false, 00:15:38.971 "copy": true, 00:15:38.971 "nvme_iov_md": false 00:15:38.971 }, 00:15:38.971 "memory_domains": [ 00:15:38.971 { 00:15:38.971 "dma_device_id": "system", 00:15:38.971 "dma_device_type": 1 00:15:38.971 }, 00:15:38.971 { 00:15:38.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.971 "dma_device_type": 2 00:15:38.971 } 00:15:38.971 ], 00:15:38.971 "driver_specific": { 00:15:38.971 "passthru": { 00:15:38.971 "name": "pt1", 00:15:38.971 "base_bdev_name": "malloc1" 00:15:38.971 } 00:15:38.971 } 00:15:38.971 }' 00:15:38.971 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.971 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.971 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.971 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.971 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.229 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.229 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.229 09:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.229 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.229 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.229 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.229 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.229 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.229 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:39.230 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.487 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.487 "name": "pt2", 00:15:39.487 "aliases": [ 00:15:39.487 "00000000-0000-0000-0000-000000000002" 00:15:39.487 ], 00:15:39.487 "product_name": "passthru", 00:15:39.487 "block_size": 512, 00:15:39.487 "num_blocks": 65536, 00:15:39.487 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:39.487 "assigned_rate_limits": { 00:15:39.487 "rw_ios_per_sec": 0, 00:15:39.487 "rw_mbytes_per_sec": 0, 00:15:39.487 "r_mbytes_per_sec": 0, 00:15:39.487 "w_mbytes_per_sec": 0 00:15:39.487 }, 00:15:39.487 "claimed": true, 00:15:39.487 "claim_type": "exclusive_write", 00:15:39.487 "zoned": false, 00:15:39.487 "supported_io_types": { 00:15:39.487 "read": true, 00:15:39.487 "write": true, 00:15:39.487 "unmap": true, 00:15:39.487 "flush": true, 00:15:39.487 "reset": true, 00:15:39.487 "nvme_admin": false, 00:15:39.487 "nvme_io": false, 00:15:39.487 "nvme_io_md": false, 00:15:39.487 "write_zeroes": true, 00:15:39.487 "zcopy": true, 00:15:39.487 "get_zone_info": false, 00:15:39.487 "zone_management": false, 00:15:39.487 "zone_append": false, 00:15:39.487 "compare": false, 00:15:39.487 "compare_and_write": false, 00:15:39.487 "abort": true, 
00:15:39.487 "seek_hole": false, 00:15:39.487 "seek_data": false, 00:15:39.488 "copy": true, 00:15:39.488 "nvme_iov_md": false 00:15:39.488 }, 00:15:39.488 "memory_domains": [ 00:15:39.488 { 00:15:39.488 "dma_device_id": "system", 00:15:39.488 "dma_device_type": 1 00:15:39.488 }, 00:15:39.488 { 00:15:39.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.488 "dma_device_type": 2 00:15:39.488 } 00:15:39.488 ], 00:15:39.488 "driver_specific": { 00:15:39.488 "passthru": { 00:15:39.488 "name": "pt2", 00:15:39.488 "base_bdev_name": "malloc2" 00:15:39.488 } 00:15:39.488 } 00:15:39.488 }' 00:15:39.488 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.488 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.488 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.488 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:39.746 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:40.003 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:40.003 "name": "pt3", 00:15:40.003 "aliases": [ 00:15:40.003 "00000000-0000-0000-0000-000000000003" 00:15:40.003 ], 00:15:40.003 "product_name": "passthru", 00:15:40.003 "block_size": 512, 00:15:40.003 "num_blocks": 65536, 00:15:40.003 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:40.003 "assigned_rate_limits": { 00:15:40.003 "rw_ios_per_sec": 0, 00:15:40.003 "rw_mbytes_per_sec": 0, 00:15:40.003 "r_mbytes_per_sec": 0, 00:15:40.003 "w_mbytes_per_sec": 0 00:15:40.003 }, 00:15:40.003 "claimed": true, 00:15:40.003 "claim_type": "exclusive_write", 00:15:40.003 "zoned": false, 00:15:40.003 "supported_io_types": { 00:15:40.003 "read": true, 00:15:40.003 "write": true, 00:15:40.003 "unmap": true, 00:15:40.003 "flush": true, 00:15:40.003 "reset": true, 00:15:40.003 "nvme_admin": false, 00:15:40.003 "nvme_io": false, 00:15:40.003 "nvme_io_md": false, 00:15:40.003 "write_zeroes": true, 00:15:40.003 "zcopy": true, 00:15:40.003 "get_zone_info": false, 00:15:40.003 "zone_management": false, 00:15:40.003 "zone_append": false, 00:15:40.003 "compare": false, 00:15:40.003 "compare_and_write": false, 00:15:40.003 "abort": true, 00:15:40.003 "seek_hole": false, 00:15:40.003 "seek_data": false, 00:15:40.003 "copy": true, 00:15:40.003 "nvme_iov_md": false 
00:15:40.003 }, 00:15:40.003 "memory_domains": [ 00:15:40.003 { 00:15:40.003 "dma_device_id": "system", 00:15:40.003 "dma_device_type": 1 00:15:40.003 }, 00:15:40.003 { 00:15:40.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.003 "dma_device_type": 2 00:15:40.003 } 00:15:40.003 ], 00:15:40.003 "driver_specific": { 00:15:40.003 "passthru": { 00:15:40.003 "name": "pt3", 00:15:40.003 "base_bdev_name": "malloc3" 00:15:40.003 } 00:15:40.003 } 00:15:40.003 }' 00:15:40.003 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.260 09:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.260 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:40.260 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.260 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.260 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:40.260 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.260 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.260 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.260 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.535 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.535 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.535 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:40.535 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:40.794 [2024-07-15 09:19:49.493759] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:40.794 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8a0bb2d8-1eca-4d9c-9b88-15763415d83f 00:15:40.794 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8a0bb2d8-1eca-4d9c-9b88-15763415d83f ']' 00:15:40.794 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:40.794 [2024-07-15 09:19:49.742131] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:40.794 [2024-07-15 09:19:49.742152] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:40.794 [2024-07-15 09:19:49.742199] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:40.794 [2024-07-15 09:19:49.742253] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:40.794 [2024-07-15 09:19:49.742265] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b11ea0 name raid_bdev1, state offline 00:15:41.053 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.053 09:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:41.311 09:19:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:41.311 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:41.312 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:41.312 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:41.312 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:41.312 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:41.571 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:41.571 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:41.830 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:41.830 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:42.088 09:19:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:42.346 [2024-07-15 09:19:51.101672] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:42.346 [2024-07-15 09:19:51.103069] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:42.346 [2024-07-15 09:19:51.103110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:42.346 [2024-07-15 09:19:51.103155] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:42.346 [2024-07-15 09:19:51.103194] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:42.346 [2024-07-15 09:19:51.103217] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:42.346 [2024-07-15 09:19:51.103236] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:42.346 [2024-07-15 09:19:51.103246] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cbcff0 name raid_bdev1, state configuring 00:15:42.346 request: 00:15:42.346 { 00:15:42.346 "name": "raid_bdev1", 00:15:42.346 "raid_level": "concat", 00:15:42.346 "base_bdevs": [ 00:15:42.346 "malloc1", 00:15:42.346 "malloc2", 00:15:42.346 "malloc3" 00:15:42.346 ], 00:15:42.346 "strip_size_kb": 64, 00:15:42.346 "superblock": false, 00:15:42.346 "method": "bdev_raid_create", 00:15:42.346 "req_id": 1 00:15:42.346 } 00:15:42.346 Got JSON-RPC error response 00:15:42.346 response: 00:15:42.346 { 00:15:42.346 "code": -17, 00:15:42.346 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:42.346 } 00:15:42.346 09:19:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:42.346 09:19:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:42.347 09:19:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:42.347 09:19:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:42.347 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.347 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:42.605 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:42.605 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:42.605 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:42.864 [2024-07-15 09:19:51.590900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:42.864 [2024-07-15 09:19:51.590943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:42.864 [2024-07-15 09:19:51.590963] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b197a0 00:15:42.864 [2024-07-15 09:19:51.590975] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:42.864 [2024-07-15 09:19:51.592604] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:42.864 [2024-07-15 09:19:51.592631] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:42.864 [2024-07-15 09:19:51.592702] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:42.864 [2024-07-15 09:19:51.592730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:42.864 pt1 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.864 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:43.122 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.122 "name": "raid_bdev1", 00:15:43.122 "uuid": "8a0bb2d8-1eca-4d9c-9b88-15763415d83f", 00:15:43.122 "strip_size_kb": 64, 00:15:43.122 "state": "configuring", 00:15:43.122 "raid_level": "concat", 00:15:43.122 "superblock": true, 00:15:43.122 "num_base_bdevs": 3, 00:15:43.122 "num_base_bdevs_discovered": 1, 00:15:43.122 "num_base_bdevs_operational": 3, 00:15:43.122 "base_bdevs_list": [ 00:15:43.122 { 00:15:43.122 "name": "pt1", 00:15:43.122 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:43.122 "is_configured": true, 00:15:43.122 "data_offset": 2048, 00:15:43.122 "data_size": 63488 00:15:43.122 }, 00:15:43.122 { 00:15:43.122 "name": null, 00:15:43.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:43.122 "is_configured": false, 00:15:43.122 "data_offset": 2048, 00:15:43.122 "data_size": 63488 00:15:43.122 }, 00:15:43.122 { 00:15:43.122 "name": null, 00:15:43.122 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:43.122 "is_configured": false, 00:15:43.122 "data_offset": 2048, 00:15:43.122 "data_size": 63488 00:15:43.122 } 00:15:43.122 ] 00:15:43.122 }' 00:15:43.122 09:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.122 09:19:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.687 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:43.687 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:43.945 [2024-07-15 
09:19:52.681796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:43.945 [2024-07-15 09:19:52.681844] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:43.945 [2024-07-15 09:19:52.681861] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b10c70 00:15:43.945 [2024-07-15 09:19:52.681873] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:43.945 [2024-07-15 09:19:52.682216] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:43.945 [2024-07-15 09:19:52.682233] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:43.945 [2024-07-15 09:19:52.682294] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:43.945 [2024-07-15 09:19:52.682313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:43.945 pt2 00:15:43.945 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:44.203 [2024-07-15 09:19:52.926451] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.203 09:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:44.469 09:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.469 "name": "raid_bdev1", 00:15:44.469 "uuid": "8a0bb2d8-1eca-4d9c-9b88-15763415d83f", 00:15:44.469 "strip_size_kb": 64, 00:15:44.469 "state": "configuring", 00:15:44.469 "raid_level": "concat", 00:15:44.469 "superblock": true, 00:15:44.469 "num_base_bdevs": 3, 00:15:44.469 "num_base_bdevs_discovered": 1, 00:15:44.469 "num_base_bdevs_operational": 3, 00:15:44.469 "base_bdevs_list": [ 00:15:44.469 { 00:15:44.469 "name": "pt1", 00:15:44.469 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:44.469 "is_configured": true, 00:15:44.469 "data_offset": 2048, 00:15:44.470 "data_size": 63488 00:15:44.470 }, 00:15:44.470 { 00:15:44.470 "name": null, 00:15:44.470 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:44.470 "is_configured": false, 
00:15:44.470 "data_offset": 2048, 00:15:44.470 "data_size": 63488 00:15:44.470 }, 00:15:44.470 { 00:15:44.470 "name": null, 00:15:44.470 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:44.470 "is_configured": false, 00:15:44.470 "data_offset": 2048, 00:15:44.470 "data_size": 63488 00:15:44.470 } 00:15:44.470 ] 00:15:44.470 }' 00:15:44.470 09:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.470 09:19:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.036 09:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:45.036 09:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:45.036 09:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:45.294 [2024-07-15 09:19:54.021347] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:45.294 [2024-07-15 09:19:54.021393] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:45.294 [2024-07-15 09:19:54.021413] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b19a10 00:15:45.294 [2024-07-15 09:19:54.021426] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.294 [2024-07-15 09:19:54.021770] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.294 [2024-07-15 09:19:54.021787] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:45.294 [2024-07-15 09:19:54.021849] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:45.294 [2024-07-15 09:19:54.021868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:45.294 pt2 00:15:45.294 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:45.294 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:45.294 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:45.552 [2024-07-15 09:19:54.266009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:45.552 [2024-07-15 09:19:54.266051] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:45.552 [2024-07-15 09:19:54.266069] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cb3740 00:15:45.552 [2024-07-15 09:19:54.266082] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.552 [2024-07-15 09:19:54.266412] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.552 [2024-07-15 09:19:54.266430] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:45.552 [2024-07-15 09:19:54.266492] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:45.552 [2024-07-15 09:19:54.266512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:45.552 [2024-07-15 09:19:54.266623] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cb3c00 00:15:45.552 [2024-07-15 09:19:54.266633] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:45.552 [2024-07-15 09:19:54.266799] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b18a40 00:15:45.552 [2024-07-15 09:19:54.266936] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cb3c00 00:15:45.552 [2024-07-15 09:19:54.266946] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cb3c00 00:15:45.552 [2024-07-15 09:19:54.267044] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:45.552 pt3 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.552 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:45.811 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.811 "name": "raid_bdev1", 00:15:45.811 "uuid": "8a0bb2d8-1eca-4d9c-9b88-15763415d83f", 00:15:45.811 "strip_size_kb": 64, 00:15:45.811 "state": "online", 00:15:45.811 "raid_level": "concat", 00:15:45.811 "superblock": true, 00:15:45.811 "num_base_bdevs": 3, 00:15:45.811 "num_base_bdevs_discovered": 3, 00:15:45.811 "num_base_bdevs_operational": 3, 00:15:45.811 "base_bdevs_list": [ 00:15:45.811 { 00:15:45.811 "name": "pt1", 00:15:45.811 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:45.811 "is_configured": true, 00:15:45.811 "data_offset": 2048, 00:15:45.811 "data_size": 63488 00:15:45.811 }, 00:15:45.811 { 00:15:45.811 "name": "pt2", 00:15:45.811 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:45.811 "is_configured": true, 00:15:45.811 "data_offset": 2048, 00:15:45.811 "data_size": 63488 00:15:45.811 }, 00:15:45.811 { 00:15:45.811 "name": "pt3", 00:15:45.811 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:45.811 "is_configured": true, 00:15:45.811 "data_offset": 2048, 00:15:45.811 "data_size": 63488 00:15:45.811 } 00:15:45.811 ] 00:15:45.811 }' 00:15:45.811 09:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.811 09:19:54 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.377 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:46.377 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:46.377 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:46.377 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:46.377 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:46.377 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:46.377 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:46.377 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:46.636 [2024-07-15 09:19:55.349270] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:46.636 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:46.636 "name": "raid_bdev1", 00:15:46.636 "aliases": [ 00:15:46.636 "8a0bb2d8-1eca-4d9c-9b88-15763415d83f" 00:15:46.636 ], 00:15:46.636 "product_name": "Raid Volume", 00:15:46.636 "block_size": 512, 00:15:46.636 "num_blocks": 190464, 00:15:46.636 "uuid": "8a0bb2d8-1eca-4d9c-9b88-15763415d83f", 00:15:46.636 "assigned_rate_limits": { 00:15:46.636 "rw_ios_per_sec": 0, 00:15:46.636 "rw_mbytes_per_sec": 0, 00:15:46.636 "r_mbytes_per_sec": 0, 00:15:46.636 "w_mbytes_per_sec": 0 00:15:46.636 }, 00:15:46.636 "claimed": false, 00:15:46.636 "zoned": false, 00:15:46.636 "supported_io_types": { 00:15:46.636 "read": true, 00:15:46.636 "write": true, 00:15:46.636 "unmap": true, 00:15:46.636 "flush": true, 00:15:46.636 "reset": true, 00:15:46.636 "nvme_admin": false, 00:15:46.636 "nvme_io": false, 00:15:46.636 "nvme_io_md": false, 00:15:46.636 "write_zeroes": true, 00:15:46.636 "zcopy": false, 00:15:46.636 "get_zone_info": false, 00:15:46.636 "zone_management": false, 00:15:46.636 "zone_append": false, 00:15:46.636 "compare": false, 00:15:46.636 "compare_and_write": false, 00:15:46.636 "abort": false, 00:15:46.636 "seek_hole": false, 00:15:46.636 "seek_data": false, 00:15:46.636 "copy": false, 00:15:46.636 "nvme_iov_md": false 00:15:46.636 }, 00:15:46.636 "memory_domains": [ 00:15:46.636 { 00:15:46.636 "dma_device_id": "system", 00:15:46.636 "dma_device_type": 1 00:15:46.636 }, 00:15:46.636 { 00:15:46.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.636 "dma_device_type": 2 00:15:46.636 }, 00:15:46.636 { 00:15:46.636 "dma_device_id": "system", 00:15:46.636 "dma_device_type": 1 00:15:46.636 }, 00:15:46.636 { 00:15:46.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.636 "dma_device_type": 2 00:15:46.636 }, 00:15:46.636 { 00:15:46.636 "dma_device_id": "system", 00:15:46.636 "dma_device_type": 1 00:15:46.636 }, 00:15:46.636 { 00:15:46.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.636 "dma_device_type": 2 00:15:46.636 } 00:15:46.636 ], 00:15:46.636 "driver_specific": { 00:15:46.636 "raid": { 00:15:46.636 "uuid": "8a0bb2d8-1eca-4d9c-9b88-15763415d83f", 00:15:46.636 "strip_size_kb": 64, 00:15:46.636 "state": "online", 00:15:46.636 "raid_level": "concat", 00:15:46.636 "superblock": true, 00:15:46.636 "num_base_bdevs": 3, 00:15:46.636 "num_base_bdevs_discovered": 3, 
00:15:46.636 "num_base_bdevs_operational": 3, 00:15:46.636 "base_bdevs_list": [ 00:15:46.636 { 00:15:46.636 "name": "pt1", 00:15:46.636 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:46.636 "is_configured": true, 00:15:46.636 "data_offset": 2048, 00:15:46.636 "data_size": 63488 00:15:46.636 }, 00:15:46.636 { 00:15:46.636 "name": "pt2", 00:15:46.636 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:46.636 "is_configured": true, 00:15:46.636 "data_offset": 2048, 00:15:46.636 "data_size": 63488 00:15:46.636 }, 00:15:46.636 { 00:15:46.636 "name": "pt3", 00:15:46.636 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:46.636 "is_configured": true, 00:15:46.636 "data_offset": 2048, 00:15:46.636 "data_size": 63488 00:15:46.636 } 00:15:46.636 ] 00:15:46.636 } 00:15:46.636 } 00:15:46.636 }' 00:15:46.636 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:46.636 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:46.636 pt2 00:15:46.636 pt3' 00:15:46.636 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:46.636 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:46.636 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:46.895 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:46.895 "name": "pt1", 00:15:46.895 "aliases": [ 00:15:46.895 "00000000-0000-0000-0000-000000000001" 00:15:46.895 ], 00:15:46.895 "product_name": "passthru", 00:15:46.895 "block_size": 512, 00:15:46.895 "num_blocks": 65536, 00:15:46.895 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:46.895 "assigned_rate_limits": { 00:15:46.895 "rw_ios_per_sec": 0, 00:15:46.895 "rw_mbytes_per_sec": 0, 00:15:46.895 "r_mbytes_per_sec": 0, 00:15:46.895 "w_mbytes_per_sec": 0 00:15:46.895 }, 00:15:46.895 "claimed": true, 00:15:46.895 "claim_type": "exclusive_write", 00:15:46.895 "zoned": false, 00:15:46.895 "supported_io_types": { 00:15:46.895 "read": true, 00:15:46.895 "write": true, 00:15:46.895 "unmap": true, 00:15:46.895 "flush": true, 00:15:46.895 "reset": true, 00:15:46.895 "nvme_admin": false, 00:15:46.895 "nvme_io": false, 00:15:46.895 "nvme_io_md": false, 00:15:46.895 "write_zeroes": true, 00:15:46.895 "zcopy": true, 00:15:46.895 "get_zone_info": false, 00:15:46.895 "zone_management": false, 00:15:46.895 "zone_append": false, 00:15:46.895 "compare": false, 00:15:46.895 "compare_and_write": false, 00:15:46.895 "abort": true, 00:15:46.895 "seek_hole": false, 00:15:46.895 "seek_data": false, 00:15:46.895 "copy": true, 00:15:46.895 "nvme_iov_md": false 00:15:46.895 }, 00:15:46.895 "memory_domains": [ 00:15:46.895 { 00:15:46.895 "dma_device_id": "system", 00:15:46.895 "dma_device_type": 1 00:15:46.895 }, 00:15:46.895 { 00:15:46.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.895 "dma_device_type": 2 00:15:46.895 } 00:15:46.895 ], 00:15:46.895 "driver_specific": { 00:15:46.895 "passthru": { 00:15:46.895 "name": "pt1", 00:15:46.895 "base_bdev_name": "malloc1" 00:15:46.895 } 00:15:46.895 } 00:15:46.895 }' 00:15:46.895 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.895 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:15:46.895 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:46.895 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.895 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.895 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:46.895 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.153 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.153 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.153 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.153 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.153 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:47.153 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:47.153 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:47.153 09:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:47.412 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:47.412 "name": "pt2", 00:15:47.412 "aliases": [ 00:15:47.412 "00000000-0000-0000-0000-000000000002" 00:15:47.412 ], 00:15:47.412 "product_name": "passthru", 00:15:47.412 "block_size": 512, 00:15:47.412 "num_blocks": 65536, 00:15:47.412 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:47.412 "assigned_rate_limits": { 00:15:47.412 "rw_ios_per_sec": 0, 00:15:47.412 "rw_mbytes_per_sec": 0, 00:15:47.412 "r_mbytes_per_sec": 0, 00:15:47.412 "w_mbytes_per_sec": 0 00:15:47.412 }, 00:15:47.412 "claimed": true, 00:15:47.412 "claim_type": "exclusive_write", 00:15:47.412 "zoned": false, 00:15:47.412 "supported_io_types": { 00:15:47.412 "read": true, 00:15:47.412 "write": true, 00:15:47.412 "unmap": true, 00:15:47.412 "flush": true, 00:15:47.412 "reset": true, 00:15:47.412 "nvme_admin": false, 00:15:47.412 "nvme_io": false, 00:15:47.412 "nvme_io_md": false, 00:15:47.412 "write_zeroes": true, 00:15:47.412 "zcopy": true, 00:15:47.412 "get_zone_info": false, 00:15:47.412 "zone_management": false, 00:15:47.412 "zone_append": false, 00:15:47.412 "compare": false, 00:15:47.412 "compare_and_write": false, 00:15:47.412 "abort": true, 00:15:47.412 "seek_hole": false, 00:15:47.412 "seek_data": false, 00:15:47.412 "copy": true, 00:15:47.412 "nvme_iov_md": false 00:15:47.412 }, 00:15:47.412 "memory_domains": [ 00:15:47.412 { 00:15:47.412 "dma_device_id": "system", 00:15:47.412 "dma_device_type": 1 00:15:47.412 }, 00:15:47.412 { 00:15:47.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.412 "dma_device_type": 2 00:15:47.412 } 00:15:47.412 ], 00:15:47.412 "driver_specific": { 00:15:47.412 "passthru": { 00:15:47.412 "name": "pt2", 00:15:47.412 "base_bdev_name": "malloc2" 00:15:47.412 } 00:15:47.412 } 00:15:47.412 }' 00:15:47.412 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.412 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.412 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:47.412 09:19:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:47.669 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:47.925 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:47.925 "name": "pt3", 00:15:47.925 "aliases": [ 00:15:47.925 "00000000-0000-0000-0000-000000000003" 00:15:47.925 ], 00:15:47.925 "product_name": "passthru", 00:15:47.925 "block_size": 512, 00:15:47.925 "num_blocks": 65536, 00:15:47.925 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:47.925 "assigned_rate_limits": { 00:15:47.925 "rw_ios_per_sec": 0, 00:15:47.925 "rw_mbytes_per_sec": 0, 00:15:47.925 "r_mbytes_per_sec": 0, 00:15:47.925 "w_mbytes_per_sec": 0 00:15:47.925 }, 00:15:47.925 "claimed": true, 00:15:47.925 "claim_type": "exclusive_write", 00:15:47.925 "zoned": false, 00:15:47.925 "supported_io_types": { 00:15:47.925 "read": true, 00:15:47.925 "write": true, 00:15:47.925 "unmap": true, 00:15:47.925 "flush": true, 00:15:47.925 "reset": true, 00:15:47.925 "nvme_admin": false, 00:15:47.925 "nvme_io": false, 00:15:47.925 "nvme_io_md": false, 00:15:47.925 "write_zeroes": true, 00:15:47.925 "zcopy": true, 00:15:47.925 "get_zone_info": false, 00:15:47.925 "zone_management": false, 00:15:47.925 "zone_append": false, 00:15:47.925 "compare": false, 00:15:47.925 "compare_and_write": false, 00:15:47.925 "abort": true, 00:15:47.925 "seek_hole": false, 00:15:47.925 "seek_data": false, 00:15:47.925 "copy": true, 00:15:47.925 "nvme_iov_md": false 00:15:47.925 }, 00:15:47.925 "memory_domains": [ 00:15:47.925 { 00:15:47.925 "dma_device_id": "system", 00:15:47.925 "dma_device_type": 1 00:15:47.925 }, 00:15:47.925 { 00:15:47.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.925 "dma_device_type": 2 00:15:47.925 } 00:15:47.925 ], 00:15:47.925 "driver_specific": { 00:15:47.925 "passthru": { 00:15:47.925 "name": "pt3", 00:15:47.925 "base_bdev_name": "malloc3" 00:15:47.925 } 00:15:47.925 } 00:15:47.926 }' 00:15:47.926 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:48.182 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:48.182 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:48.182 09:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:48.182 09:19:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:48.182 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:48.182 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:48.182 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:48.182 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:48.182 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:48.440 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:48.440 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:48.440 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:48.440 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:48.698 [2024-07-15 09:19:57.422807] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8a0bb2d8-1eca-4d9c-9b88-15763415d83f '!=' 8a0bb2d8-1eca-4d9c-9b88-15763415d83f ']' 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 126777 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 126777 ']' 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 126777 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 126777 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 126777' 00:15:48.698 killing process with pid 126777 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 126777 00:15:48.698 [2024-07-15 09:19:57.494673] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:48.698 [2024-07-15 09:19:57.494739] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:48.698 [2024-07-15 09:19:57.494797] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:48.698 [2024-07-15 09:19:57.494812] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cb3c00 name raid_bdev1, state offline 00:15:48.698 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 126777 00:15:48.698 [2024-07-15 09:19:57.525639] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:48.957 09:19:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:48.957 00:15:48.957 real 0m14.794s 00:15:48.957 user 0m26.581s 00:15:48.957 sys 0m2.716s 00:15:48.957 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:48.957 09:19:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.957 ************************************ 00:15:48.957 END TEST raid_superblock_test 00:15:48.957 ************************************ 00:15:48.957 09:19:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:48.957 09:19:57 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:15:48.957 09:19:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:48.957 09:19:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:48.957 09:19:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:48.957 ************************************ 00:15:48.957 START TEST raid_read_error_test 00:15:48.957 ************************************ 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:48.957 09:19:57 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.GsG8uyrsap 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=129003 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 129003 /var/tmp/spdk-raid.sock 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 129003 ']' 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:48.957 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:48.957 09:19:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.216 [2024-07-15 09:19:57.915828] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:15:49.216 [2024-07-15 09:19:57.915899] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid129003 ] 00:15:49.216 [2024-07-15 09:19:58.047137] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:49.216 [2024-07-15 09:19:58.151206] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:49.473 [2024-07-15 09:19:58.213054] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:49.473 [2024-07-15 09:19:58.213088] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:50.037 09:19:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:50.037 09:19:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:50.037 09:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:50.037 09:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:50.293 BaseBdev1_malloc 00:15:50.293 09:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:50.293 true 00:15:50.294 09:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:50.552 [2024-07-15 09:19:59.446140] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:50.552 [2024-07-15 09:19:59.446186] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:50.552 [2024-07-15 09:19:59.446206] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16700d0 00:15:50.552 [2024-07-15 09:19:59.446219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:50.552 [2024-07-15 09:19:59.447939] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:50.552 [2024-07-15 09:19:59.447968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:50.552 BaseBdev1 00:15:50.552 09:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:50.552 09:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:50.810 BaseBdev2_malloc 00:15:50.810 09:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:51.068 true 00:15:51.068 09:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:51.326 [2024-07-15 09:20:00.144677] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:51.326 [2024-07-15 09:20:00.144721] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:51.326 [2024-07-15 09:20:00.144743] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1674910 00:15:51.326 [2024-07-15 09:20:00.144756] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:51.326 [2024-07-15 09:20:00.146306] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:51.326 [2024-07-15 09:20:00.146336] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:51.326 BaseBdev2 00:15:51.326 09:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:51.326 09:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:51.621 BaseBdev3_malloc 00:15:51.621 09:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:51.880 true 00:15:51.880 09:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:51.880 [2024-07-15 09:20:00.823146] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:51.880 [2024-07-15 09:20:00.823192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:51.880 [2024-07-15 09:20:00.823211] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1676bd0 00:15:51.880 [2024-07-15 09:20:00.823223] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:51.880 [2024-07-15 09:20:00.824598] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:51.880 [2024-07-15 09:20:00.824624] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:51.880 BaseBdev3 00:15:52.138 09:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:52.138 [2024-07-15 09:20:01.071836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:52.138 [2024-07-15 09:20:01.073050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:52.138 [2024-07-15 09:20:01.073119] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:52.138 [2024-07-15 09:20:01.073325] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1678280 00:15:52.138 [2024-07-15 09:20:01.073337] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:52.138 [2024-07-15 09:20:01.073513] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1677e20 00:15:52.138 [2024-07-15 09:20:01.073652] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1678280 00:15:52.138 [2024-07-15 09:20:01.073662] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1678280 00:15:52.138 [2024-07-15 09:20:01.073755] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:52.396 
09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:52.396 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.396 "name": "raid_bdev1", 00:15:52.396 "uuid": "3c7f418c-02fa-4ded-a1a6-1660695151cb", 00:15:52.396 "strip_size_kb": 64, 00:15:52.396 "state": "online", 00:15:52.396 "raid_level": "concat", 00:15:52.396 "superblock": true, 00:15:52.396 "num_base_bdevs": 3, 00:15:52.396 "num_base_bdevs_discovered": 3, 00:15:52.396 "num_base_bdevs_operational": 3, 00:15:52.396 "base_bdevs_list": [ 00:15:52.396 { 00:15:52.396 "name": "BaseBdev1", 00:15:52.396 "uuid": "b3ce12ae-4216-59f4-9f10-8b034f36b7c7", 00:15:52.396 "is_configured": true, 00:15:52.396 "data_offset": 2048, 00:15:52.396 "data_size": 63488 00:15:52.396 }, 00:15:52.396 { 00:15:52.396 "name": "BaseBdev2", 00:15:52.396 "uuid": "9bb2dcb3-5a4b-56bd-8a3a-dcf5fd2e62f2", 00:15:52.396 "is_configured": true, 00:15:52.396 "data_offset": 2048, 00:15:52.396 "data_size": 63488 00:15:52.396 }, 00:15:52.396 { 00:15:52.396 "name": "BaseBdev3", 00:15:52.396 "uuid": "628ab315-1b1a-5158-9603-1ef28b79b687", 00:15:52.396 "is_configured": true, 00:15:52.397 "data_offset": 2048, 00:15:52.397 "data_size": 63488 00:15:52.397 } 00:15:52.397 ] 00:15:52.397 }' 00:15:52.397 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.397 09:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:52.962 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:52.962 09:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:53.220 [2024-07-15 09:20:01.926383] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c64d0 00:15:54.154 09:20:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:54.154 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:15:54.155 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:54.155 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:54.155 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:54.155 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:54.155 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:54.155 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:54.155 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:54.155 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:54.413 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.413 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.413 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.413 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.413 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.413 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:54.413 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.413 "name": "raid_bdev1", 00:15:54.413 "uuid": "3c7f418c-02fa-4ded-a1a6-1660695151cb", 00:15:54.413 "strip_size_kb": 64, 00:15:54.413 "state": "online", 00:15:54.413 "raid_level": "concat", 00:15:54.413 "superblock": true, 00:15:54.413 "num_base_bdevs": 3, 00:15:54.413 "num_base_bdevs_discovered": 3, 00:15:54.413 "num_base_bdevs_operational": 3, 00:15:54.413 "base_bdevs_list": [ 00:15:54.413 { 00:15:54.413 "name": "BaseBdev1", 00:15:54.413 "uuid": "b3ce12ae-4216-59f4-9f10-8b034f36b7c7", 00:15:54.413 "is_configured": true, 00:15:54.413 "data_offset": 2048, 00:15:54.413 "data_size": 63488 00:15:54.413 }, 00:15:54.413 { 00:15:54.413 "name": "BaseBdev2", 00:15:54.413 "uuid": "9bb2dcb3-5a4b-56bd-8a3a-dcf5fd2e62f2", 00:15:54.413 "is_configured": true, 00:15:54.413 "data_offset": 2048, 00:15:54.413 "data_size": 63488 00:15:54.413 }, 00:15:54.413 { 00:15:54.413 "name": "BaseBdev3", 00:15:54.413 "uuid": "628ab315-1b1a-5158-9603-1ef28b79b687", 00:15:54.413 "is_configured": true, 00:15:54.413 "data_offset": 2048, 00:15:54.413 "data_size": 63488 00:15:54.413 } 00:15:54.413 ] 00:15:54.413 }' 00:15:54.413 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.413 09:20:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.979 09:20:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:55.237 [2024-07-15 09:20:04.107541] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:55.237 [2024-07-15 09:20:04.107580] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:55.237 [2024-07-15 
09:20:04.110770] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:55.237 [2024-07-15 09:20:04.110808] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:55.237 [2024-07-15 09:20:04.110842] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:55.237 [2024-07-15 09:20:04.110855] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1678280 name raid_bdev1, state offline 00:15:55.237 0 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 129003 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 129003 ']' 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 129003 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 129003 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 129003' 00:15:55.237 killing process with pid 129003 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 129003 00:15:55.237 [2024-07-15 09:20:04.175692] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:55.237 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 129003 00:15:55.495 [2024-07-15 09:20:04.196404] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.GsG8uyrsap 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:15:55.495 00:15:55.495 real 0m6.596s 00:15:55.495 user 0m10.270s 00:15:55.495 sys 0m1.213s 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:55.495 09:20:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.495 ************************************ 00:15:55.495 END TEST raid_read_error_test 00:15:55.495 ************************************ 00:15:55.753 09:20:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:55.753 09:20:04 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:15:55.753 09:20:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:55.753 
09:20:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:55.753 09:20:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:55.753 ************************************ 00:15:55.753 START TEST raid_write_error_test 00:15:55.753 ************************************ 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.68CDVyLi8p 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=129985 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 129985 /var/tmp/spdk-raid.sock 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 129985 ']' 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:55.753 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:55.753 09:20:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.753 [2024-07-15 09:20:04.599367] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:15:55.753 [2024-07-15 09:20:04.599437] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid129985 ] 00:15:56.011 [2024-07-15 09:20:04.729115] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:56.011 [2024-07-15 09:20:04.834621] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:56.011 [2024-07-15 09:20:04.895413] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:56.011 [2024-07-15 09:20:04.895452] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:56.577 09:20:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:56.577 09:20:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:56.577 09:20:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:56.577 09:20:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:56.835 BaseBdev1_malloc 00:15:56.835 09:20:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:57.093 true 00:15:57.093 09:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:57.351 [2024-07-15 09:20:06.232779] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:57.351 [2024-07-15 09:20:06.232826] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:57.351 [2024-07-15 09:20:06.232846] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e40d0 00:15:57.351 [2024-07-15 09:20:06.232860] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:57.351 [2024-07-15 09:20:06.234750] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:57.351 [2024-07-15 09:20:06.234781] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:15:57.351 BaseBdev1 00:15:57.351 09:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:57.352 09:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:57.610 BaseBdev2_malloc 00:15:57.610 09:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:57.868 true 00:15:57.868 09:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:58.126 [2024-07-15 09:20:06.972513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:58.126 [2024-07-15 09:20:06.972557] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:58.126 [2024-07-15 09:20:06.972578] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e8910 00:15:58.126 [2024-07-15 09:20:06.972590] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:58.126 [2024-07-15 09:20:06.974137] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:58.126 [2024-07-15 09:20:06.974165] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:58.126 BaseBdev2 00:15:58.126 09:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:58.126 09:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:58.385 BaseBdev3_malloc 00:15:58.385 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:58.643 true 00:15:58.643 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:58.901 [2024-07-15 09:20:07.711967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:58.901 [2024-07-15 09:20:07.712014] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:58.901 [2024-07-15 09:20:07.712035] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18eabd0 00:15:58.901 [2024-07-15 09:20:07.712048] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:58.901 [2024-07-15 09:20:07.713667] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:58.901 [2024-07-15 09:20:07.713695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:58.901 BaseBdev3 00:15:58.901 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:59.159 [2024-07-15 09:20:07.956651] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:59.159 [2024-07-15 09:20:07.958026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:59.159 [2024-07-15 09:20:07.958096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:59.159 [2024-07-15 09:20:07.958306] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18ec280 00:15:59.159 [2024-07-15 09:20:07.958318] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:59.159 [2024-07-15 09:20:07.958516] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18ebe20 00:15:59.159 [2024-07-15 09:20:07.958666] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18ec280 00:15:59.159 [2024-07-15 09:20:07.958676] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18ec280 00:15:59.159 [2024-07-15 09:20:07.958780] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.159 09:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:59.417 09:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.417 "name": "raid_bdev1", 00:15:59.417 "uuid": "4dcbe95f-3107-4dad-934d-4517227d1963", 00:15:59.417 "strip_size_kb": 64, 00:15:59.417 "state": "online", 00:15:59.417 "raid_level": "concat", 00:15:59.417 "superblock": true, 00:15:59.417 "num_base_bdevs": 3, 00:15:59.417 "num_base_bdevs_discovered": 3, 00:15:59.417 "num_base_bdevs_operational": 3, 00:15:59.417 "base_bdevs_list": [ 00:15:59.417 { 00:15:59.417 "name": "BaseBdev1", 00:15:59.417 "uuid": "cc23294c-9cfb-5861-a4f9-67bb3250d39a", 00:15:59.417 "is_configured": true, 00:15:59.417 "data_offset": 2048, 00:15:59.417 "data_size": 63488 00:15:59.417 }, 00:15:59.417 { 00:15:59.417 "name": "BaseBdev2", 00:15:59.417 "uuid": "ef4ada3a-d278-52e7-a180-fd18f03926a2", 00:15:59.417 "is_configured": true, 00:15:59.417 "data_offset": 2048, 00:15:59.417 "data_size": 63488 00:15:59.417 }, 00:15:59.417 { 00:15:59.417 "name": "BaseBdev3", 00:15:59.417 "uuid": 
"12314841-321b-50c7-a2c8-7e8516d2e637", 00:15:59.417 "is_configured": true, 00:15:59.417 "data_offset": 2048, 00:15:59.417 "data_size": 63488 00:15:59.417 } 00:15:59.417 ] 00:15:59.417 }' 00:15:59.417 09:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.417 09:20:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:59.983 09:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:59.983 09:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:00.240 [2024-07-15 09:20:08.963593] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x173a4d0 00:16:01.174 09:20:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.174 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:01.433 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.433 "name": "raid_bdev1", 00:16:01.433 "uuid": "4dcbe95f-3107-4dad-934d-4517227d1963", 00:16:01.433 "strip_size_kb": 64, 00:16:01.433 "state": "online", 00:16:01.433 "raid_level": "concat", 00:16:01.433 "superblock": true, 00:16:01.433 "num_base_bdevs": 3, 00:16:01.433 "num_base_bdevs_discovered": 3, 00:16:01.433 "num_base_bdevs_operational": 3, 00:16:01.433 "base_bdevs_list": [ 00:16:01.433 { 00:16:01.433 "name": "BaseBdev1", 00:16:01.433 "uuid": "cc23294c-9cfb-5861-a4f9-67bb3250d39a", 00:16:01.433 "is_configured": true, 00:16:01.433 "data_offset": 2048, 00:16:01.433 "data_size": 63488 00:16:01.433 }, 00:16:01.433 { 
00:16:01.433 "name": "BaseBdev2", 00:16:01.433 "uuid": "ef4ada3a-d278-52e7-a180-fd18f03926a2", 00:16:01.433 "is_configured": true, 00:16:01.433 "data_offset": 2048, 00:16:01.433 "data_size": 63488 00:16:01.433 }, 00:16:01.433 { 00:16:01.433 "name": "BaseBdev3", 00:16:01.433 "uuid": "12314841-321b-50c7-a2c8-7e8516d2e637", 00:16:01.433 "is_configured": true, 00:16:01.433 "data_offset": 2048, 00:16:01.433 "data_size": 63488 00:16:01.433 } 00:16:01.433 ] 00:16:01.433 }' 00:16:01.433 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.433 09:20:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.999 09:20:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:02.258 [2024-07-15 09:20:11.088259] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:02.258 [2024-07-15 09:20:11.088297] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:02.258 [2024-07-15 09:20:11.091466] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:02.258 [2024-07-15 09:20:11.091503] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:02.258 [2024-07-15 09:20:11.091537] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:02.258 [2024-07-15 09:20:11.091549] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18ec280 name raid_bdev1, state offline 00:16:02.258 0 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 129985 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 129985 ']' 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 129985 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 129985 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 129985' 00:16:02.258 killing process with pid 129985 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 129985 00:16:02.258 [2024-07-15 09:20:11.159257] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:02.258 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 129985 00:16:02.258 [2024-07-15 09:20:11.180650] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.68CDVyLi8p 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # 
fail_per_s=0.47 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:16:02.517 00:16:02.517 real 0m6.899s 00:16:02.517 user 0m10.953s 00:16:02.517 sys 0m1.205s 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:02.517 09:20:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.517 ************************************ 00:16:02.517 END TEST raid_write_error_test 00:16:02.517 ************************************ 00:16:02.517 09:20:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:02.517 09:20:11 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:02.517 09:20:11 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:16:02.517 09:20:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:02.517 09:20:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:02.517 09:20:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:02.775 ************************************ 00:16:02.775 START TEST raid_state_function_test 00:16:02.775 ************************************ 00:16:02.775 09:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:16:02.775 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:02.776 09:20:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=130956 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 130956' 00:16:02.776 Process raid pid: 130956 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 130956 /var/tmp/spdk-raid.sock 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 130956 ']' 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:02.776 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:02.776 09:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.776 [2024-07-15 09:20:11.576891] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:16:02.776 [2024-07-15 09:20:11.576973] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:02.776 [2024-07-15 09:20:11.709207] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:03.034 [2024-07-15 09:20:11.813953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:03.034 [2024-07-15 09:20:11.874139] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:03.034 [2024-07-15 09:20:11.874165] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:03.599 09:20:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:03.599 09:20:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:03.599 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:03.600 [2024-07-15 09:20:12.552276] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:03.600 [2024-07-15 09:20:12.552323] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:03.600 [2024-07-15 09:20:12.552333] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:03.600 [2024-07-15 09:20:12.552345] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:03.600 [2024-07-15 09:20:12.552354] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:03.600 [2024-07-15 09:20:12.552366] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:16:03.858 "name": "Existed_Raid", 00:16:03.858 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.858 "strip_size_kb": 0, 00:16:03.858 "state": "configuring", 00:16:03.858 "raid_level": "raid1", 00:16:03.858 "superblock": false, 00:16:03.858 "num_base_bdevs": 3, 00:16:03.858 "num_base_bdevs_discovered": 0, 00:16:03.858 "num_base_bdevs_operational": 3, 00:16:03.858 "base_bdevs_list": [ 00:16:03.858 { 00:16:03.858 "name": "BaseBdev1", 00:16:03.858 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.858 "is_configured": false, 00:16:03.858 "data_offset": 0, 00:16:03.858 "data_size": 0 00:16:03.858 }, 00:16:03.858 { 00:16:03.858 "name": "BaseBdev2", 00:16:03.858 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.858 "is_configured": false, 00:16:03.858 "data_offset": 0, 00:16:03.858 "data_size": 0 00:16:03.858 }, 00:16:03.858 { 00:16:03.858 "name": "BaseBdev3", 00:16:03.858 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.858 "is_configured": false, 00:16:03.858 "data_offset": 0, 00:16:03.858 "data_size": 0 00:16:03.858 } 00:16:03.858 ] 00:16:03.858 }' 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.858 09:20:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.424 09:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:04.680 [2024-07-15 09:20:13.562801] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:04.680 [2024-07-15 09:20:13.562830] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11f5a80 name Existed_Raid, state configuring 00:16:04.681 09:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:04.937 [2024-07-15 09:20:13.803451] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:04.937 [2024-07-15 09:20:13.803479] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:04.937 [2024-07-15 09:20:13.803489] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:04.937 [2024-07-15 09:20:13.803500] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:04.937 [2024-07-15 09:20:13.803509] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:04.937 [2024-07-15 09:20:13.803520] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:04.937 09:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:05.194 [2024-07-15 09:20:14.057964] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:05.194 BaseBdev1 00:16:05.194 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:05.194 09:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:05.194 09:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:05.194 
09:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:05.194 09:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:05.194 09:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:05.194 09:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:05.452 09:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:05.710 [ 00:16:05.710 { 00:16:05.710 "name": "BaseBdev1", 00:16:05.710 "aliases": [ 00:16:05.710 "a188989d-5e6a-44f4-a549-bbebe58095fe" 00:16:05.710 ], 00:16:05.710 "product_name": "Malloc disk", 00:16:05.710 "block_size": 512, 00:16:05.710 "num_blocks": 65536, 00:16:05.710 "uuid": "a188989d-5e6a-44f4-a549-bbebe58095fe", 00:16:05.710 "assigned_rate_limits": { 00:16:05.710 "rw_ios_per_sec": 0, 00:16:05.710 "rw_mbytes_per_sec": 0, 00:16:05.710 "r_mbytes_per_sec": 0, 00:16:05.710 "w_mbytes_per_sec": 0 00:16:05.710 }, 00:16:05.710 "claimed": true, 00:16:05.710 "claim_type": "exclusive_write", 00:16:05.710 "zoned": false, 00:16:05.710 "supported_io_types": { 00:16:05.710 "read": true, 00:16:05.710 "write": true, 00:16:05.710 "unmap": true, 00:16:05.710 "flush": true, 00:16:05.710 "reset": true, 00:16:05.710 "nvme_admin": false, 00:16:05.710 "nvme_io": false, 00:16:05.710 "nvme_io_md": false, 00:16:05.710 "write_zeroes": true, 00:16:05.710 "zcopy": true, 00:16:05.710 "get_zone_info": false, 00:16:05.710 "zone_management": false, 00:16:05.710 "zone_append": false, 00:16:05.710 "compare": false, 00:16:05.710 "compare_and_write": false, 00:16:05.710 "abort": true, 00:16:05.710 "seek_hole": false, 00:16:05.710 "seek_data": false, 00:16:05.710 "copy": true, 00:16:05.710 "nvme_iov_md": false 00:16:05.710 }, 00:16:05.710 "memory_domains": [ 00:16:05.710 { 00:16:05.710 "dma_device_id": "system", 00:16:05.710 "dma_device_type": 1 00:16:05.710 }, 00:16:05.710 { 00:16:05.710 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.710 "dma_device_type": 2 00:16:05.710 } 00:16:05.710 ], 00:16:05.710 "driver_specific": {} 00:16:05.710 } 00:16:05.710 ] 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.710 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.968 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.968 "name": "Existed_Raid", 00:16:05.968 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.968 "strip_size_kb": 0, 00:16:05.968 "state": "configuring", 00:16:05.968 "raid_level": "raid1", 00:16:05.968 "superblock": false, 00:16:05.968 "num_base_bdevs": 3, 00:16:05.968 "num_base_bdevs_discovered": 1, 00:16:05.968 "num_base_bdevs_operational": 3, 00:16:05.968 "base_bdevs_list": [ 00:16:05.968 { 00:16:05.968 "name": "BaseBdev1", 00:16:05.968 "uuid": "a188989d-5e6a-44f4-a549-bbebe58095fe", 00:16:05.968 "is_configured": true, 00:16:05.968 "data_offset": 0, 00:16:05.968 "data_size": 65536 00:16:05.968 }, 00:16:05.968 { 00:16:05.968 "name": "BaseBdev2", 00:16:05.968 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.968 "is_configured": false, 00:16:05.968 "data_offset": 0, 00:16:05.968 "data_size": 0 00:16:05.968 }, 00:16:05.968 { 00:16:05.968 "name": "BaseBdev3", 00:16:05.968 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.968 "is_configured": false, 00:16:05.968 "data_offset": 0, 00:16:05.968 "data_size": 0 00:16:05.968 } 00:16:05.968 ] 00:16:05.968 }' 00:16:05.968 09:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.968 09:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.567 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:06.825 [2024-07-15 09:20:15.634144] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:06.825 [2024-07-15 09:20:15.634185] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11f5310 name Existed_Raid, state configuring 00:16:06.825 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:07.083 [2024-07-15 09:20:15.874813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:07.083 [2024-07-15 09:20:15.876287] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:07.083 [2024-07-15 09:20:15.876322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:07.083 [2024-07-15 09:20:15.876332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:07.083 [2024-07-15 09:20:15.876344] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.083 09:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.341 09:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.341 "name": "Existed_Raid", 00:16:07.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.341 "strip_size_kb": 0, 00:16:07.341 "state": "configuring", 00:16:07.341 "raid_level": "raid1", 00:16:07.341 "superblock": false, 00:16:07.341 "num_base_bdevs": 3, 00:16:07.341 "num_base_bdevs_discovered": 1, 00:16:07.341 "num_base_bdevs_operational": 3, 00:16:07.341 "base_bdevs_list": [ 00:16:07.341 { 00:16:07.341 "name": "BaseBdev1", 00:16:07.341 "uuid": "a188989d-5e6a-44f4-a549-bbebe58095fe", 00:16:07.341 "is_configured": true, 00:16:07.341 "data_offset": 0, 00:16:07.341 "data_size": 65536 00:16:07.341 }, 00:16:07.341 { 00:16:07.341 "name": "BaseBdev2", 00:16:07.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.341 "is_configured": false, 00:16:07.341 "data_offset": 0, 00:16:07.341 "data_size": 0 00:16:07.341 }, 00:16:07.341 { 00:16:07.341 "name": "BaseBdev3", 00:16:07.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.341 "is_configured": false, 00:16:07.341 "data_offset": 0, 00:16:07.341 "data_size": 0 00:16:07.341 } 00:16:07.341 ] 00:16:07.341 }' 00:16:07.341 09:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.341 09:20:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.906 09:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:08.164 [2024-07-15 09:20:16.989252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:08.164 BaseBdev2 00:16:08.164 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:08.164 09:20:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:08.164 09:20:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:08.164 09:20:17 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:08.164 09:20:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:08.164 09:20:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:08.164 09:20:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:08.423 09:20:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:08.681 [ 00:16:08.681 { 00:16:08.681 "name": "BaseBdev2", 00:16:08.681 "aliases": [ 00:16:08.681 "3d3f0ebc-86c0-417c-be64-00ba831b83fc" 00:16:08.681 ], 00:16:08.681 "product_name": "Malloc disk", 00:16:08.681 "block_size": 512, 00:16:08.681 "num_blocks": 65536, 00:16:08.681 "uuid": "3d3f0ebc-86c0-417c-be64-00ba831b83fc", 00:16:08.681 "assigned_rate_limits": { 00:16:08.681 "rw_ios_per_sec": 0, 00:16:08.681 "rw_mbytes_per_sec": 0, 00:16:08.681 "r_mbytes_per_sec": 0, 00:16:08.681 "w_mbytes_per_sec": 0 00:16:08.681 }, 00:16:08.681 "claimed": true, 00:16:08.681 "claim_type": "exclusive_write", 00:16:08.681 "zoned": false, 00:16:08.681 "supported_io_types": { 00:16:08.681 "read": true, 00:16:08.681 "write": true, 00:16:08.681 "unmap": true, 00:16:08.681 "flush": true, 00:16:08.681 "reset": true, 00:16:08.681 "nvme_admin": false, 00:16:08.681 "nvme_io": false, 00:16:08.681 "nvme_io_md": false, 00:16:08.681 "write_zeroes": true, 00:16:08.681 "zcopy": true, 00:16:08.681 "get_zone_info": false, 00:16:08.681 "zone_management": false, 00:16:08.681 "zone_append": false, 00:16:08.681 "compare": false, 00:16:08.681 "compare_and_write": false, 00:16:08.681 "abort": true, 00:16:08.681 "seek_hole": false, 00:16:08.681 "seek_data": false, 00:16:08.681 "copy": true, 00:16:08.681 "nvme_iov_md": false 00:16:08.681 }, 00:16:08.681 "memory_domains": [ 00:16:08.681 { 00:16:08.681 "dma_device_id": "system", 00:16:08.681 "dma_device_type": 1 00:16:08.681 }, 00:16:08.681 { 00:16:08.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.681 "dma_device_type": 2 00:16:08.681 } 00:16:08.681 ], 00:16:08.681 "driver_specific": {} 00:16:08.681 } 00:16:08.681 ] 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.681 
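The waitforbdev/verify_raid_bdev_state plumbing that repeats through this trace is bdev_wait_for_examine plus bdev_get_bdevs with a timeout, followed by a jq pass over bdev_raid_get_bdevs. A rough equivalent of the state check is sketched below; the helper name and argument order are illustrative rather than the script's exact implementation, and it reuses the same socket and the hypothetical SPDK_ROOT shorthand as above.

rpc="$SPDK_ROOT/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Compare the fields verify_raid_bdev_state cares about against expectations.
check_raid_state() {
    local name=$1 want_state=$2 want_level=$3 want_discovered=$4 info
    info=$($rpc bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
    [ "$(jq -r .state <<< "$info")" = "$want_state" ] &&
    [ "$(jq -r .raid_level <<< "$info")" = "$want_level" ] &&
    [ "$(jq -r .num_base_bdevs_discovered <<< "$info")" = "$want_discovered" ]
}

# With only BaseBdev1 present, the raid is still assembling:
check_raid_state Existed_Raid configuring raid1 1
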
09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.681 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.938 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.938 "name": "Existed_Raid", 00:16:08.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.938 "strip_size_kb": 0, 00:16:08.938 "state": "configuring", 00:16:08.938 "raid_level": "raid1", 00:16:08.938 "superblock": false, 00:16:08.938 "num_base_bdevs": 3, 00:16:08.938 "num_base_bdevs_discovered": 2, 00:16:08.938 "num_base_bdevs_operational": 3, 00:16:08.938 "base_bdevs_list": [ 00:16:08.938 { 00:16:08.938 "name": "BaseBdev1", 00:16:08.938 "uuid": "a188989d-5e6a-44f4-a549-bbebe58095fe", 00:16:08.938 "is_configured": true, 00:16:08.938 "data_offset": 0, 00:16:08.938 "data_size": 65536 00:16:08.938 }, 00:16:08.938 { 00:16:08.938 "name": "BaseBdev2", 00:16:08.938 "uuid": "3d3f0ebc-86c0-417c-be64-00ba831b83fc", 00:16:08.938 "is_configured": true, 00:16:08.938 "data_offset": 0, 00:16:08.938 "data_size": 65536 00:16:08.938 }, 00:16:08.938 { 00:16:08.938 "name": "BaseBdev3", 00:16:08.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.938 "is_configured": false, 00:16:08.938 "data_offset": 0, 00:16:08.938 "data_size": 0 00:16:08.938 } 00:16:08.938 ] 00:16:08.938 }' 00:16:08.938 09:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.938 09:20:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.505 09:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:09.762 [2024-07-15 09:20:18.564892] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:09.762 [2024-07-15 09:20:18.564939] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11f6400 00:16:09.762 [2024-07-15 09:20:18.564948] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:09.762 [2024-07-15 09:20:18.565201] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f5ef0 00:16:09.762 [2024-07-15 09:20:18.565325] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11f6400 00:16:09.762 [2024-07-15 09:20:18.565336] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11f6400 00:16:09.762 [2024-07-15 09:20:18.565500] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:09.762 BaseBdev3 00:16:09.762 09:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:09.762 09:20:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:09.762 09:20:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:09.762 09:20:18 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:09.762 09:20:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:09.762 09:20:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:09.762 09:20:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:10.020 09:20:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:10.278 [ 00:16:10.278 { 00:16:10.278 "name": "BaseBdev3", 00:16:10.278 "aliases": [ 00:16:10.278 "70764c60-a51c-43c3-97a9-557666154ead" 00:16:10.278 ], 00:16:10.278 "product_name": "Malloc disk", 00:16:10.278 "block_size": 512, 00:16:10.278 "num_blocks": 65536, 00:16:10.278 "uuid": "70764c60-a51c-43c3-97a9-557666154ead", 00:16:10.278 "assigned_rate_limits": { 00:16:10.278 "rw_ios_per_sec": 0, 00:16:10.278 "rw_mbytes_per_sec": 0, 00:16:10.278 "r_mbytes_per_sec": 0, 00:16:10.278 "w_mbytes_per_sec": 0 00:16:10.278 }, 00:16:10.278 "claimed": true, 00:16:10.278 "claim_type": "exclusive_write", 00:16:10.278 "zoned": false, 00:16:10.278 "supported_io_types": { 00:16:10.278 "read": true, 00:16:10.278 "write": true, 00:16:10.278 "unmap": true, 00:16:10.278 "flush": true, 00:16:10.278 "reset": true, 00:16:10.278 "nvme_admin": false, 00:16:10.278 "nvme_io": false, 00:16:10.278 "nvme_io_md": false, 00:16:10.278 "write_zeroes": true, 00:16:10.278 "zcopy": true, 00:16:10.278 "get_zone_info": false, 00:16:10.278 "zone_management": false, 00:16:10.278 "zone_append": false, 00:16:10.278 "compare": false, 00:16:10.278 "compare_and_write": false, 00:16:10.278 "abort": true, 00:16:10.278 "seek_hole": false, 00:16:10.278 "seek_data": false, 00:16:10.278 "copy": true, 00:16:10.278 "nvme_iov_md": false 00:16:10.278 }, 00:16:10.278 "memory_domains": [ 00:16:10.278 { 00:16:10.278 "dma_device_id": "system", 00:16:10.278 "dma_device_type": 1 00:16:10.278 }, 00:16:10.278 { 00:16:10.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.278 "dma_device_type": 2 00:16:10.278 } 00:16:10.278 ], 00:16:10.278 "driver_specific": {} 00:16:10.278 } 00:16:10.278 ] 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.278 09:20:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:10.278 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.536 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.536 "name": "Existed_Raid", 00:16:10.536 "uuid": "5621bb8c-9cc9-442a-aa3c-edb3221a0155", 00:16:10.536 "strip_size_kb": 0, 00:16:10.536 "state": "online", 00:16:10.536 "raid_level": "raid1", 00:16:10.536 "superblock": false, 00:16:10.536 "num_base_bdevs": 3, 00:16:10.536 "num_base_bdevs_discovered": 3, 00:16:10.536 "num_base_bdevs_operational": 3, 00:16:10.536 "base_bdevs_list": [ 00:16:10.536 { 00:16:10.536 "name": "BaseBdev1", 00:16:10.536 "uuid": "a188989d-5e6a-44f4-a549-bbebe58095fe", 00:16:10.536 "is_configured": true, 00:16:10.536 "data_offset": 0, 00:16:10.536 "data_size": 65536 00:16:10.536 }, 00:16:10.536 { 00:16:10.536 "name": "BaseBdev2", 00:16:10.536 "uuid": "3d3f0ebc-86c0-417c-be64-00ba831b83fc", 00:16:10.536 "is_configured": true, 00:16:10.536 "data_offset": 0, 00:16:10.536 "data_size": 65536 00:16:10.536 }, 00:16:10.536 { 00:16:10.536 "name": "BaseBdev3", 00:16:10.536 "uuid": "70764c60-a51c-43c3-97a9-557666154ead", 00:16:10.537 "is_configured": true, 00:16:10.537 "data_offset": 0, 00:16:10.537 "data_size": 65536 00:16:10.537 } 00:16:10.537 ] 00:16:10.537 }' 00:16:10.537 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.537 09:20:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.103 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:11.103 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:11.103 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:11.103 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:11.103 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:11.103 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:11.103 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:11.103 09:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:11.103 [2024-07-15 09:20:19.980944] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:11.103 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:11.103 "name": "Existed_Raid", 00:16:11.103 "aliases": [ 00:16:11.103 "5621bb8c-9cc9-442a-aa3c-edb3221a0155" 00:16:11.103 ], 00:16:11.103 "product_name": "Raid Volume", 00:16:11.103 "block_size": 512, 00:16:11.103 "num_blocks": 65536, 00:16:11.103 "uuid": "5621bb8c-9cc9-442a-aa3c-edb3221a0155", 
00:16:11.103 "assigned_rate_limits": { 00:16:11.103 "rw_ios_per_sec": 0, 00:16:11.103 "rw_mbytes_per_sec": 0, 00:16:11.103 "r_mbytes_per_sec": 0, 00:16:11.103 "w_mbytes_per_sec": 0 00:16:11.103 }, 00:16:11.103 "claimed": false, 00:16:11.103 "zoned": false, 00:16:11.103 "supported_io_types": { 00:16:11.103 "read": true, 00:16:11.103 "write": true, 00:16:11.103 "unmap": false, 00:16:11.103 "flush": false, 00:16:11.103 "reset": true, 00:16:11.103 "nvme_admin": false, 00:16:11.103 "nvme_io": false, 00:16:11.103 "nvme_io_md": false, 00:16:11.103 "write_zeroes": true, 00:16:11.103 "zcopy": false, 00:16:11.103 "get_zone_info": false, 00:16:11.103 "zone_management": false, 00:16:11.103 "zone_append": false, 00:16:11.103 "compare": false, 00:16:11.103 "compare_and_write": false, 00:16:11.103 "abort": false, 00:16:11.103 "seek_hole": false, 00:16:11.103 "seek_data": false, 00:16:11.103 "copy": false, 00:16:11.103 "nvme_iov_md": false 00:16:11.103 }, 00:16:11.103 "memory_domains": [ 00:16:11.103 { 00:16:11.103 "dma_device_id": "system", 00:16:11.103 "dma_device_type": 1 00:16:11.103 }, 00:16:11.103 { 00:16:11.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.103 "dma_device_type": 2 00:16:11.103 }, 00:16:11.103 { 00:16:11.103 "dma_device_id": "system", 00:16:11.103 "dma_device_type": 1 00:16:11.103 }, 00:16:11.103 { 00:16:11.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.103 "dma_device_type": 2 00:16:11.103 }, 00:16:11.103 { 00:16:11.103 "dma_device_id": "system", 00:16:11.103 "dma_device_type": 1 00:16:11.103 }, 00:16:11.103 { 00:16:11.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.103 "dma_device_type": 2 00:16:11.103 } 00:16:11.103 ], 00:16:11.103 "driver_specific": { 00:16:11.103 "raid": { 00:16:11.103 "uuid": "5621bb8c-9cc9-442a-aa3c-edb3221a0155", 00:16:11.103 "strip_size_kb": 0, 00:16:11.103 "state": "online", 00:16:11.103 "raid_level": "raid1", 00:16:11.103 "superblock": false, 00:16:11.103 "num_base_bdevs": 3, 00:16:11.103 "num_base_bdevs_discovered": 3, 00:16:11.103 "num_base_bdevs_operational": 3, 00:16:11.103 "base_bdevs_list": [ 00:16:11.103 { 00:16:11.103 "name": "BaseBdev1", 00:16:11.103 "uuid": "a188989d-5e6a-44f4-a549-bbebe58095fe", 00:16:11.103 "is_configured": true, 00:16:11.103 "data_offset": 0, 00:16:11.103 "data_size": 65536 00:16:11.103 }, 00:16:11.103 { 00:16:11.103 "name": "BaseBdev2", 00:16:11.103 "uuid": "3d3f0ebc-86c0-417c-be64-00ba831b83fc", 00:16:11.103 "is_configured": true, 00:16:11.103 "data_offset": 0, 00:16:11.103 "data_size": 65536 00:16:11.103 }, 00:16:11.103 { 00:16:11.103 "name": "BaseBdev3", 00:16:11.103 "uuid": "70764c60-a51c-43c3-97a9-557666154ead", 00:16:11.103 "is_configured": true, 00:16:11.103 "data_offset": 0, 00:16:11.103 "data_size": 65536 00:16:11.103 } 00:16:11.103 ] 00:16:11.103 } 00:16:11.103 } 00:16:11.103 }' 00:16:11.103 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:11.103 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:11.103 BaseBdev2 00:16:11.103 BaseBdev3' 00:16:11.103 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.103 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:11.103 09:20:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.362 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.362 "name": "BaseBdev1", 00:16:11.362 "aliases": [ 00:16:11.362 "a188989d-5e6a-44f4-a549-bbebe58095fe" 00:16:11.362 ], 00:16:11.362 "product_name": "Malloc disk", 00:16:11.362 "block_size": 512, 00:16:11.362 "num_blocks": 65536, 00:16:11.362 "uuid": "a188989d-5e6a-44f4-a549-bbebe58095fe", 00:16:11.362 "assigned_rate_limits": { 00:16:11.362 "rw_ios_per_sec": 0, 00:16:11.362 "rw_mbytes_per_sec": 0, 00:16:11.362 "r_mbytes_per_sec": 0, 00:16:11.362 "w_mbytes_per_sec": 0 00:16:11.362 }, 00:16:11.362 "claimed": true, 00:16:11.362 "claim_type": "exclusive_write", 00:16:11.362 "zoned": false, 00:16:11.362 "supported_io_types": { 00:16:11.362 "read": true, 00:16:11.362 "write": true, 00:16:11.362 "unmap": true, 00:16:11.362 "flush": true, 00:16:11.362 "reset": true, 00:16:11.362 "nvme_admin": false, 00:16:11.362 "nvme_io": false, 00:16:11.362 "nvme_io_md": false, 00:16:11.362 "write_zeroes": true, 00:16:11.362 "zcopy": true, 00:16:11.362 "get_zone_info": false, 00:16:11.362 "zone_management": false, 00:16:11.362 "zone_append": false, 00:16:11.362 "compare": false, 00:16:11.362 "compare_and_write": false, 00:16:11.362 "abort": true, 00:16:11.362 "seek_hole": false, 00:16:11.362 "seek_data": false, 00:16:11.362 "copy": true, 00:16:11.362 "nvme_iov_md": false 00:16:11.362 }, 00:16:11.362 "memory_domains": [ 00:16:11.362 { 00:16:11.362 "dma_device_id": "system", 00:16:11.362 "dma_device_type": 1 00:16:11.362 }, 00:16:11.362 { 00:16:11.362 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.362 "dma_device_type": 2 00:16:11.362 } 00:16:11.362 ], 00:16:11.362 "driver_specific": {} 00:16:11.362 }' 00:16:11.362 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.362 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.362 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.620 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.620 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.620 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.620 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.620 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.620 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.620 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.620 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.878 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.878 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.878 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:11.878 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.878 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.878 "name": "BaseBdev2", 
00:16:11.878 "aliases": [ 00:16:11.878 "3d3f0ebc-86c0-417c-be64-00ba831b83fc" 00:16:11.878 ], 00:16:11.878 "product_name": "Malloc disk", 00:16:11.878 "block_size": 512, 00:16:11.878 "num_blocks": 65536, 00:16:11.878 "uuid": "3d3f0ebc-86c0-417c-be64-00ba831b83fc", 00:16:11.878 "assigned_rate_limits": { 00:16:11.878 "rw_ios_per_sec": 0, 00:16:11.878 "rw_mbytes_per_sec": 0, 00:16:11.878 "r_mbytes_per_sec": 0, 00:16:11.878 "w_mbytes_per_sec": 0 00:16:11.878 }, 00:16:11.878 "claimed": true, 00:16:11.878 "claim_type": "exclusive_write", 00:16:11.878 "zoned": false, 00:16:11.878 "supported_io_types": { 00:16:11.878 "read": true, 00:16:11.878 "write": true, 00:16:11.878 "unmap": true, 00:16:11.878 "flush": true, 00:16:11.878 "reset": true, 00:16:11.878 "nvme_admin": false, 00:16:11.878 "nvme_io": false, 00:16:11.878 "nvme_io_md": false, 00:16:11.878 "write_zeroes": true, 00:16:11.878 "zcopy": true, 00:16:11.878 "get_zone_info": false, 00:16:11.878 "zone_management": false, 00:16:11.878 "zone_append": false, 00:16:11.878 "compare": false, 00:16:11.878 "compare_and_write": false, 00:16:11.878 "abort": true, 00:16:11.878 "seek_hole": false, 00:16:11.878 "seek_data": false, 00:16:11.878 "copy": true, 00:16:11.878 "nvme_iov_md": false 00:16:11.878 }, 00:16:11.878 "memory_domains": [ 00:16:11.878 { 00:16:11.878 "dma_device_id": "system", 00:16:11.878 "dma_device_type": 1 00:16:11.878 }, 00:16:11.878 { 00:16:11.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.878 "dma_device_type": 2 00:16:11.878 } 00:16:11.878 ], 00:16:11.878 "driver_specific": {} 00:16:11.878 }' 00:16:11.878 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.878 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.136 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:12.136 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.136 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.136 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:12.136 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.136 09:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.136 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.136 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.136 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.394 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.394 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:12.394 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:12.394 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:12.652 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:12.652 "name": "BaseBdev3", 00:16:12.652 "aliases": [ 00:16:12.652 "70764c60-a51c-43c3-97a9-557666154ead" 00:16:12.652 ], 00:16:12.652 "product_name": "Malloc disk", 00:16:12.652 "block_size": 512, 
00:16:12.652 "num_blocks": 65536, 00:16:12.652 "uuid": "70764c60-a51c-43c3-97a9-557666154ead", 00:16:12.652 "assigned_rate_limits": { 00:16:12.652 "rw_ios_per_sec": 0, 00:16:12.652 "rw_mbytes_per_sec": 0, 00:16:12.652 "r_mbytes_per_sec": 0, 00:16:12.652 "w_mbytes_per_sec": 0 00:16:12.652 }, 00:16:12.652 "claimed": true, 00:16:12.652 "claim_type": "exclusive_write", 00:16:12.652 "zoned": false, 00:16:12.653 "supported_io_types": { 00:16:12.653 "read": true, 00:16:12.653 "write": true, 00:16:12.653 "unmap": true, 00:16:12.653 "flush": true, 00:16:12.653 "reset": true, 00:16:12.653 "nvme_admin": false, 00:16:12.653 "nvme_io": false, 00:16:12.653 "nvme_io_md": false, 00:16:12.653 "write_zeroes": true, 00:16:12.653 "zcopy": true, 00:16:12.653 "get_zone_info": false, 00:16:12.653 "zone_management": false, 00:16:12.653 "zone_append": false, 00:16:12.653 "compare": false, 00:16:12.653 "compare_and_write": false, 00:16:12.653 "abort": true, 00:16:12.653 "seek_hole": false, 00:16:12.653 "seek_data": false, 00:16:12.653 "copy": true, 00:16:12.653 "nvme_iov_md": false 00:16:12.653 }, 00:16:12.653 "memory_domains": [ 00:16:12.653 { 00:16:12.653 "dma_device_id": "system", 00:16:12.653 "dma_device_type": 1 00:16:12.653 }, 00:16:12.653 { 00:16:12.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.653 "dma_device_type": 2 00:16:12.653 } 00:16:12.653 ], 00:16:12.653 "driver_specific": {} 00:16:12.653 }' 00:16:12.653 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.653 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.653 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:12.653 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.653 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.653 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:12.653 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.653 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.911 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.911 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.911 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.911 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.911 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:13.169 [2024-07-15 09:20:21.937908] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.169 09:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.428 09:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.428 "name": "Existed_Raid", 00:16:13.428 "uuid": "5621bb8c-9cc9-442a-aa3c-edb3221a0155", 00:16:13.428 "strip_size_kb": 0, 00:16:13.428 "state": "online", 00:16:13.428 "raid_level": "raid1", 00:16:13.428 "superblock": false, 00:16:13.428 "num_base_bdevs": 3, 00:16:13.428 "num_base_bdevs_discovered": 2, 00:16:13.428 "num_base_bdevs_operational": 2, 00:16:13.428 "base_bdevs_list": [ 00:16:13.428 { 00:16:13.428 "name": null, 00:16:13.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.428 "is_configured": false, 00:16:13.428 "data_offset": 0, 00:16:13.428 "data_size": 65536 00:16:13.428 }, 00:16:13.428 { 00:16:13.428 "name": "BaseBdev2", 00:16:13.428 "uuid": "3d3f0ebc-86c0-417c-be64-00ba831b83fc", 00:16:13.428 "is_configured": true, 00:16:13.428 "data_offset": 0, 00:16:13.428 "data_size": 65536 00:16:13.428 }, 00:16:13.428 { 00:16:13.428 "name": "BaseBdev3", 00:16:13.428 "uuid": "70764c60-a51c-43c3-97a9-557666154ead", 00:16:13.428 "is_configured": true, 00:16:13.428 "data_offset": 0, 00:16:13.428 "data_size": 65536 00:16:13.428 } 00:16:13.428 ] 00:16:13.428 }' 00:16:13.428 09:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.428 09:20:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.993 09:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:13.993 09:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:13.993 09:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.993 09:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:14.250 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:14.250 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid 
'!=' Existed_Raid ']' 00:16:14.250 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:14.506 [2024-07-15 09:20:23.287401] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:14.506 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:14.506 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:14.506 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.506 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:14.763 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:14.763 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:14.763 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:15.020 [2024-07-15 09:20:23.793148] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:15.020 [2024-07-15 09:20:23.793241] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:15.020 [2024-07-15 09:20:23.805962] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:15.020 [2024-07-15 09:20:23.805998] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:15.020 [2024-07-15 09:20:23.806011] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11f6400 name Existed_Raid, state offline 00:16:15.020 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:15.020 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:15.020 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.020 09:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:15.276 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:15.276 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:15.276 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:15.276 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:15.276 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:15.276 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:15.533 BaseBdev2 00:16:15.533 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:15.533 09:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:15.533 09:20:24 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:15.533 09:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:15.533 09:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:15.533 09:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:15.533 09:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.790 09:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:16.048 [ 00:16:16.048 { 00:16:16.048 "name": "BaseBdev2", 00:16:16.048 "aliases": [ 00:16:16.048 "2ec15916-c76d-48ac-bffa-b96503913406" 00:16:16.048 ], 00:16:16.048 "product_name": "Malloc disk", 00:16:16.048 "block_size": 512, 00:16:16.048 "num_blocks": 65536, 00:16:16.048 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:16.048 "assigned_rate_limits": { 00:16:16.048 "rw_ios_per_sec": 0, 00:16:16.048 "rw_mbytes_per_sec": 0, 00:16:16.048 "r_mbytes_per_sec": 0, 00:16:16.048 "w_mbytes_per_sec": 0 00:16:16.048 }, 00:16:16.048 "claimed": false, 00:16:16.048 "zoned": false, 00:16:16.048 "supported_io_types": { 00:16:16.048 "read": true, 00:16:16.048 "write": true, 00:16:16.048 "unmap": true, 00:16:16.048 "flush": true, 00:16:16.048 "reset": true, 00:16:16.048 "nvme_admin": false, 00:16:16.048 "nvme_io": false, 00:16:16.048 "nvme_io_md": false, 00:16:16.048 "write_zeroes": true, 00:16:16.048 "zcopy": true, 00:16:16.048 "get_zone_info": false, 00:16:16.048 "zone_management": false, 00:16:16.048 "zone_append": false, 00:16:16.048 "compare": false, 00:16:16.048 "compare_and_write": false, 00:16:16.048 "abort": true, 00:16:16.048 "seek_hole": false, 00:16:16.048 "seek_data": false, 00:16:16.048 "copy": true, 00:16:16.048 "nvme_iov_md": false 00:16:16.048 }, 00:16:16.048 "memory_domains": [ 00:16:16.048 { 00:16:16.048 "dma_device_id": "system", 00:16:16.048 "dma_device_type": 1 00:16:16.048 }, 00:16:16.048 { 00:16:16.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.048 "dma_device_type": 2 00:16:16.048 } 00:16:16.048 ], 00:16:16.048 "driver_specific": {} 00:16:16.048 } 00:16:16.048 ] 00:16:16.048 09:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:16.048 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:16.048 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:16.048 09:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:16.305 BaseBdev3 00:16:16.305 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:16.305 09:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:16.305 09:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:16.305 09:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:16.305 09:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:16.305 09:20:25 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:16.305 09:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:16.562 09:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:16.562 [ 00:16:16.562 { 00:16:16.562 "name": "BaseBdev3", 00:16:16.562 "aliases": [ 00:16:16.562 "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca" 00:16:16.562 ], 00:16:16.562 "product_name": "Malloc disk", 00:16:16.562 "block_size": 512, 00:16:16.562 "num_blocks": 65536, 00:16:16.562 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:16.562 "assigned_rate_limits": { 00:16:16.562 "rw_ios_per_sec": 0, 00:16:16.562 "rw_mbytes_per_sec": 0, 00:16:16.562 "r_mbytes_per_sec": 0, 00:16:16.562 "w_mbytes_per_sec": 0 00:16:16.562 }, 00:16:16.562 "claimed": false, 00:16:16.562 "zoned": false, 00:16:16.562 "supported_io_types": { 00:16:16.562 "read": true, 00:16:16.562 "write": true, 00:16:16.562 "unmap": true, 00:16:16.562 "flush": true, 00:16:16.562 "reset": true, 00:16:16.562 "nvme_admin": false, 00:16:16.562 "nvme_io": false, 00:16:16.562 "nvme_io_md": false, 00:16:16.562 "write_zeroes": true, 00:16:16.562 "zcopy": true, 00:16:16.562 "get_zone_info": false, 00:16:16.562 "zone_management": false, 00:16:16.562 "zone_append": false, 00:16:16.562 "compare": false, 00:16:16.562 "compare_and_write": false, 00:16:16.562 "abort": true, 00:16:16.562 "seek_hole": false, 00:16:16.562 "seek_data": false, 00:16:16.562 "copy": true, 00:16:16.562 "nvme_iov_md": false 00:16:16.562 }, 00:16:16.562 "memory_domains": [ 00:16:16.562 { 00:16:16.562 "dma_device_id": "system", 00:16:16.562 "dma_device_type": 1 00:16:16.562 }, 00:16:16.562 { 00:16:16.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.562 "dma_device_type": 2 00:16:16.562 } 00:16:16.562 ], 00:16:16.562 "driver_specific": {} 00:16:16.562 } 00:16:16.562 ] 00:16:16.819 09:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:16.819 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:16.819 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:16.819 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:16.819 [2024-07-15 09:20:25.752837] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:16.819 [2024-07-15 09:20:25.752881] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:16.819 [2024-07-15 09:20:25.752902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:16.819 [2024-07-15 09:20:25.754227] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:16.819 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.076 09:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.076 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.076 "name": "Existed_Raid", 00:16:17.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.076 "strip_size_kb": 0, 00:16:17.076 "state": "configuring", 00:16:17.076 "raid_level": "raid1", 00:16:17.076 "superblock": false, 00:16:17.076 "num_base_bdevs": 3, 00:16:17.076 "num_base_bdevs_discovered": 2, 00:16:17.076 "num_base_bdevs_operational": 3, 00:16:17.076 "base_bdevs_list": [ 00:16:17.076 { 00:16:17.076 "name": "BaseBdev1", 00:16:17.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.076 "is_configured": false, 00:16:17.076 "data_offset": 0, 00:16:17.076 "data_size": 0 00:16:17.076 }, 00:16:17.076 { 00:16:17.076 "name": "BaseBdev2", 00:16:17.076 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:17.076 "is_configured": true, 00:16:17.076 "data_offset": 0, 00:16:17.076 "data_size": 65536 00:16:17.076 }, 00:16:17.076 { 00:16:17.076 "name": "BaseBdev3", 00:16:17.076 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:17.076 "is_configured": true, 00:16:17.076 "data_offset": 0, 00:16:17.076 "data_size": 65536 00:16:17.076 } 00:16:17.076 ] 00:16:17.076 }' 00:16:17.076 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.076 09:20:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:17.640 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:17.897 [2024-07-15 09:20:26.815632] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.897 09:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.153 09:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.153 "name": "Existed_Raid", 00:16:18.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.153 "strip_size_kb": 0, 00:16:18.153 "state": "configuring", 00:16:18.153 "raid_level": "raid1", 00:16:18.153 "superblock": false, 00:16:18.153 "num_base_bdevs": 3, 00:16:18.153 "num_base_bdevs_discovered": 1, 00:16:18.153 "num_base_bdevs_operational": 3, 00:16:18.153 "base_bdevs_list": [ 00:16:18.153 { 00:16:18.153 "name": "BaseBdev1", 00:16:18.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.153 "is_configured": false, 00:16:18.153 "data_offset": 0, 00:16:18.153 "data_size": 0 00:16:18.153 }, 00:16:18.153 { 00:16:18.153 "name": null, 00:16:18.153 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:18.153 "is_configured": false, 00:16:18.153 "data_offset": 0, 00:16:18.153 "data_size": 65536 00:16:18.153 }, 00:16:18.153 { 00:16:18.153 "name": "BaseBdev3", 00:16:18.153 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:18.153 "is_configured": true, 00:16:18.153 "data_offset": 0, 00:16:18.153 "data_size": 65536 00:16:18.153 } 00:16:18.153 ] 00:16:18.153 }' 00:16:18.153 09:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.153 09:20:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.086 09:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.086 09:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:19.086 09:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:19.086 09:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:19.344 [2024-07-15 09:20:28.155733] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:19.344 BaseBdev1 00:16:19.344 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:19.344 09:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:19.344 09:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:19.344 09:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:19.344 09:20:28 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:19.344 09:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:19.344 09:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:19.602 09:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:19.859 [ 00:16:19.859 { 00:16:19.859 "name": "BaseBdev1", 00:16:19.859 "aliases": [ 00:16:19.859 "0c94ef58-71dc-46fe-823a-d23d17642c6c" 00:16:19.859 ], 00:16:19.859 "product_name": "Malloc disk", 00:16:19.859 "block_size": 512, 00:16:19.859 "num_blocks": 65536, 00:16:19.859 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:19.859 "assigned_rate_limits": { 00:16:19.859 "rw_ios_per_sec": 0, 00:16:19.859 "rw_mbytes_per_sec": 0, 00:16:19.859 "r_mbytes_per_sec": 0, 00:16:19.859 "w_mbytes_per_sec": 0 00:16:19.859 }, 00:16:19.859 "claimed": true, 00:16:19.859 "claim_type": "exclusive_write", 00:16:19.859 "zoned": false, 00:16:19.859 "supported_io_types": { 00:16:19.859 "read": true, 00:16:19.859 "write": true, 00:16:19.859 "unmap": true, 00:16:19.859 "flush": true, 00:16:19.859 "reset": true, 00:16:19.859 "nvme_admin": false, 00:16:19.859 "nvme_io": false, 00:16:19.859 "nvme_io_md": false, 00:16:19.859 "write_zeroes": true, 00:16:19.859 "zcopy": true, 00:16:19.859 "get_zone_info": false, 00:16:19.859 "zone_management": false, 00:16:19.859 "zone_append": false, 00:16:19.859 "compare": false, 00:16:19.859 "compare_and_write": false, 00:16:19.859 "abort": true, 00:16:19.859 "seek_hole": false, 00:16:19.859 "seek_data": false, 00:16:19.859 "copy": true, 00:16:19.859 "nvme_iov_md": false 00:16:19.859 }, 00:16:19.859 "memory_domains": [ 00:16:19.859 { 00:16:19.859 "dma_device_id": "system", 00:16:19.859 "dma_device_type": 1 00:16:19.859 }, 00:16:19.859 { 00:16:19.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.859 "dma_device_type": 2 00:16:19.859 } 00:16:19.859 ], 00:16:19.859 "driver_specific": {} 00:16:19.859 } 00:16:19.859 ] 00:16:19.859 09:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.860 09:20:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.860 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:20.117 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.117 "name": "Existed_Raid", 00:16:20.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.117 "strip_size_kb": 0, 00:16:20.117 "state": "configuring", 00:16:20.117 "raid_level": "raid1", 00:16:20.117 "superblock": false, 00:16:20.117 "num_base_bdevs": 3, 00:16:20.117 "num_base_bdevs_discovered": 2, 00:16:20.117 "num_base_bdevs_operational": 3, 00:16:20.117 "base_bdevs_list": [ 00:16:20.117 { 00:16:20.117 "name": "BaseBdev1", 00:16:20.117 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:20.117 "is_configured": true, 00:16:20.117 "data_offset": 0, 00:16:20.117 "data_size": 65536 00:16:20.117 }, 00:16:20.117 { 00:16:20.117 "name": null, 00:16:20.117 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:20.117 "is_configured": false, 00:16:20.117 "data_offset": 0, 00:16:20.117 "data_size": 65536 00:16:20.117 }, 00:16:20.117 { 00:16:20.117 "name": "BaseBdev3", 00:16:20.117 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:20.117 "is_configured": true, 00:16:20.117 "data_offset": 0, 00:16:20.117 "data_size": 65536 00:16:20.117 } 00:16:20.117 ] 00:16:20.117 }' 00:16:20.117 09:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.117 09:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.716 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.717 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:20.975 [2024-07-15 09:20:29.892365] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.975 
09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.975 09:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:21.233 09:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.233 "name": "Existed_Raid", 00:16:21.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:21.233 "strip_size_kb": 0, 00:16:21.233 "state": "configuring", 00:16:21.233 "raid_level": "raid1", 00:16:21.233 "superblock": false, 00:16:21.233 "num_base_bdevs": 3, 00:16:21.233 "num_base_bdevs_discovered": 1, 00:16:21.233 "num_base_bdevs_operational": 3, 00:16:21.233 "base_bdevs_list": [ 00:16:21.233 { 00:16:21.233 "name": "BaseBdev1", 00:16:21.233 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:21.233 "is_configured": true, 00:16:21.233 "data_offset": 0, 00:16:21.233 "data_size": 65536 00:16:21.233 }, 00:16:21.233 { 00:16:21.233 "name": null, 00:16:21.233 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:21.233 "is_configured": false, 00:16:21.233 "data_offset": 0, 00:16:21.233 "data_size": 65536 00:16:21.233 }, 00:16:21.233 { 00:16:21.233 "name": null, 00:16:21.233 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:21.233 "is_configured": false, 00:16:21.233 "data_offset": 0, 00:16:21.233 "data_size": 65536 00:16:21.233 } 00:16:21.233 ] 00:16:21.233 }' 00:16:21.233 09:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.233 09:20:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.797 09:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.797 09:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:22.055 09:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:22.055 09:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:22.313 [2024-07-15 09:20:31.131670] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.313 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:22.570 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:22.570 "name": "Existed_Raid", 00:16:22.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:22.570 "strip_size_kb": 0, 00:16:22.570 "state": "configuring", 00:16:22.570 "raid_level": "raid1", 00:16:22.570 "superblock": false, 00:16:22.570 "num_base_bdevs": 3, 00:16:22.570 "num_base_bdevs_discovered": 2, 00:16:22.570 "num_base_bdevs_operational": 3, 00:16:22.570 "base_bdevs_list": [ 00:16:22.570 { 00:16:22.570 "name": "BaseBdev1", 00:16:22.570 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:22.570 "is_configured": true, 00:16:22.570 "data_offset": 0, 00:16:22.570 "data_size": 65536 00:16:22.570 }, 00:16:22.570 { 00:16:22.570 "name": null, 00:16:22.570 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:22.570 "is_configured": false, 00:16:22.570 "data_offset": 0, 00:16:22.570 "data_size": 65536 00:16:22.570 }, 00:16:22.570 { 00:16:22.570 "name": "BaseBdev3", 00:16:22.570 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:22.570 "is_configured": true, 00:16:22.570 "data_offset": 0, 00:16:22.570 "data_size": 65536 00:16:22.570 } 00:16:22.570 ] 00:16:22.570 }' 00:16:22.570 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:22.570 09:20:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.151 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.151 09:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:23.408 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:23.408 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:23.667 [2024-07-15 09:20:32.415096] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:23.667 
09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.667 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.925 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.925 "name": "Existed_Raid", 00:16:23.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.925 "strip_size_kb": 0, 00:16:23.925 "state": "configuring", 00:16:23.925 "raid_level": "raid1", 00:16:23.925 "superblock": false, 00:16:23.925 "num_base_bdevs": 3, 00:16:23.925 "num_base_bdevs_discovered": 1, 00:16:23.925 "num_base_bdevs_operational": 3, 00:16:23.925 "base_bdevs_list": [ 00:16:23.925 { 00:16:23.926 "name": null, 00:16:23.926 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:23.926 "is_configured": false, 00:16:23.926 "data_offset": 0, 00:16:23.926 "data_size": 65536 00:16:23.926 }, 00:16:23.926 { 00:16:23.926 "name": null, 00:16:23.926 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:23.926 "is_configured": false, 00:16:23.926 "data_offset": 0, 00:16:23.926 "data_size": 65536 00:16:23.926 }, 00:16:23.926 { 00:16:23.926 "name": "BaseBdev3", 00:16:23.926 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:23.926 "is_configured": true, 00:16:23.926 "data_offset": 0, 00:16:23.926 "data_size": 65536 00:16:23.926 } 00:16:23.926 ] 00:16:23.926 }' 00:16:23.926 09:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.926 09:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.491 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.491 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:24.747 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:24.747 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:25.005 [2024-07-15 09:20:33.793220] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:25.005 09:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.263 09:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:25.263 "name": "Existed_Raid", 00:16:25.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.263 "strip_size_kb": 0, 00:16:25.263 "state": "configuring", 00:16:25.263 "raid_level": "raid1", 00:16:25.263 "superblock": false, 00:16:25.263 "num_base_bdevs": 3, 00:16:25.263 "num_base_bdevs_discovered": 2, 00:16:25.263 "num_base_bdevs_operational": 3, 00:16:25.263 "base_bdevs_list": [ 00:16:25.263 { 00:16:25.263 "name": null, 00:16:25.263 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:25.263 "is_configured": false, 00:16:25.263 "data_offset": 0, 00:16:25.263 "data_size": 65536 00:16:25.263 }, 00:16:25.263 { 00:16:25.263 "name": "BaseBdev2", 00:16:25.263 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:25.263 "is_configured": true, 00:16:25.263 "data_offset": 0, 00:16:25.263 "data_size": 65536 00:16:25.263 }, 00:16:25.263 { 00:16:25.263 "name": "BaseBdev3", 00:16:25.263 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:25.263 "is_configured": true, 00:16:25.263 "data_offset": 0, 00:16:25.263 "data_size": 65536 00:16:25.263 } 00:16:25.263 ] 00:16:25.263 }' 00:16:25.263 09:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:25.263 09:20:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.829 09:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.829 09:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:26.087 09:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:26.087 09:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.087 09:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:26.345 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 0c94ef58-71dc-46fe-823a-d23d17642c6c 00:16:26.603 [2024-07-15 09:20:35.450274] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:26.603 [2024-07-15 09:20:35.450316] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11f9e40 00:16:26.603 [2024-07-15 09:20:35.450325] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:26.603 [2024-07-15 09:20:35.450518] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f6e60 00:16:26.603 [2024-07-15 09:20:35.450642] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11f9e40 00:16:26.603 [2024-07-15 09:20:35.450652] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11f9e40 00:16:26.603 [2024-07-15 09:20:35.450829] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:26.603 NewBaseBdev 00:16:26.603 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:26.603 09:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:26.603 09:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:26.603 09:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:26.603 09:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:26.603 09:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:26.603 09:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:26.860 09:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:27.119 [ 00:16:27.119 { 00:16:27.119 "name": "NewBaseBdev", 00:16:27.119 "aliases": [ 00:16:27.119 "0c94ef58-71dc-46fe-823a-d23d17642c6c" 00:16:27.119 ], 00:16:27.119 "product_name": "Malloc disk", 00:16:27.119 "block_size": 512, 00:16:27.119 "num_blocks": 65536, 00:16:27.119 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:27.119 "assigned_rate_limits": { 00:16:27.119 "rw_ios_per_sec": 0, 00:16:27.119 "rw_mbytes_per_sec": 0, 00:16:27.119 "r_mbytes_per_sec": 0, 00:16:27.119 "w_mbytes_per_sec": 0 00:16:27.119 }, 00:16:27.119 "claimed": true, 00:16:27.119 "claim_type": "exclusive_write", 00:16:27.119 "zoned": false, 00:16:27.119 "supported_io_types": { 00:16:27.119 "read": true, 00:16:27.119 "write": true, 00:16:27.119 "unmap": true, 00:16:27.119 "flush": true, 00:16:27.119 "reset": true, 00:16:27.119 "nvme_admin": false, 00:16:27.119 "nvme_io": false, 00:16:27.119 "nvme_io_md": false, 00:16:27.119 "write_zeroes": true, 00:16:27.119 "zcopy": true, 00:16:27.119 "get_zone_info": false, 00:16:27.119 "zone_management": false, 00:16:27.119 "zone_append": false, 00:16:27.119 "compare": false, 00:16:27.119 "compare_and_write": false, 00:16:27.119 "abort": true, 00:16:27.119 "seek_hole": false, 00:16:27.119 "seek_data": false, 00:16:27.119 "copy": true, 00:16:27.119 "nvme_iov_md": false 00:16:27.119 }, 00:16:27.119 "memory_domains": [ 00:16:27.119 { 00:16:27.119 "dma_device_id": "system", 00:16:27.119 "dma_device_type": 1 00:16:27.119 }, 00:16:27.119 { 00:16:27.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.119 "dma_device_type": 2 00:16:27.119 } 00:16:27.119 ], 00:16:27.119 "driver_specific": {} 00:16:27.119 } 00:16:27.119 ] 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.119 09:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:27.377 09:20:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.377 "name": "Existed_Raid", 00:16:27.377 "uuid": "12f406cc-44bc-4b3c-8155-2ab795d610fc", 00:16:27.377 "strip_size_kb": 0, 00:16:27.377 "state": "online", 00:16:27.377 "raid_level": "raid1", 00:16:27.377 "superblock": false, 00:16:27.377 "num_base_bdevs": 3, 00:16:27.377 "num_base_bdevs_discovered": 3, 00:16:27.377 "num_base_bdevs_operational": 3, 00:16:27.377 "base_bdevs_list": [ 00:16:27.377 { 00:16:27.377 "name": "NewBaseBdev", 00:16:27.377 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:27.377 "is_configured": true, 00:16:27.377 "data_offset": 0, 00:16:27.377 "data_size": 65536 00:16:27.377 }, 00:16:27.377 { 00:16:27.377 "name": "BaseBdev2", 00:16:27.377 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:27.377 "is_configured": true, 00:16:27.377 "data_offset": 0, 00:16:27.377 "data_size": 65536 00:16:27.377 }, 00:16:27.377 { 00:16:27.377 "name": "BaseBdev3", 00:16:27.377 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:27.377 "is_configured": true, 00:16:27.377 "data_offset": 0, 00:16:27.377 "data_size": 65536 00:16:27.377 } 00:16:27.377 ] 00:16:27.377 }' 00:16:27.377 09:20:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.377 09:20:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.943 09:20:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:27.943 09:20:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:27.943 09:20:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:27.943 09:20:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:27.943 09:20:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:27.943 09:20:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:27.943 09:20:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:27.943 09:20:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:28.201 [2024-07-15 09:20:36.998727] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:28.201 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:28.201 "name": "Existed_Raid", 00:16:28.201 "aliases": [ 00:16:28.201 "12f406cc-44bc-4b3c-8155-2ab795d610fc" 00:16:28.201 ], 00:16:28.201 "product_name": "Raid Volume", 00:16:28.201 "block_size": 512, 00:16:28.201 "num_blocks": 65536, 00:16:28.201 "uuid": "12f406cc-44bc-4b3c-8155-2ab795d610fc", 00:16:28.201 "assigned_rate_limits": { 00:16:28.201 "rw_ios_per_sec": 0, 00:16:28.201 "rw_mbytes_per_sec": 0, 00:16:28.201 "r_mbytes_per_sec": 0, 00:16:28.201 "w_mbytes_per_sec": 0 00:16:28.201 }, 00:16:28.201 "claimed": false, 00:16:28.201 "zoned": false, 00:16:28.201 "supported_io_types": { 00:16:28.201 "read": true, 00:16:28.201 "write": true, 00:16:28.201 "unmap": false, 00:16:28.201 "flush": false, 00:16:28.201 "reset": true, 00:16:28.201 "nvme_admin": false, 00:16:28.201 "nvme_io": false, 00:16:28.201 "nvme_io_md": false, 00:16:28.201 "write_zeroes": true, 00:16:28.201 "zcopy": false, 00:16:28.201 "get_zone_info": false, 00:16:28.201 "zone_management": false, 00:16:28.201 "zone_append": false, 00:16:28.201 "compare": false, 00:16:28.201 "compare_and_write": false, 00:16:28.201 "abort": false, 00:16:28.201 "seek_hole": false, 00:16:28.201 "seek_data": false, 00:16:28.201 "copy": false, 00:16:28.201 "nvme_iov_md": false 00:16:28.201 }, 00:16:28.201 "memory_domains": [ 00:16:28.201 { 00:16:28.201 "dma_device_id": "system", 00:16:28.201 "dma_device_type": 1 00:16:28.201 }, 00:16:28.201 { 00:16:28.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.201 "dma_device_type": 2 00:16:28.201 }, 00:16:28.201 { 00:16:28.201 "dma_device_id": "system", 00:16:28.201 "dma_device_type": 1 00:16:28.201 }, 00:16:28.201 { 00:16:28.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.201 "dma_device_type": 2 00:16:28.201 }, 00:16:28.201 { 00:16:28.201 "dma_device_id": "system", 00:16:28.201 "dma_device_type": 1 00:16:28.201 }, 00:16:28.201 { 00:16:28.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.201 "dma_device_type": 2 00:16:28.201 } 00:16:28.201 ], 00:16:28.201 "driver_specific": { 00:16:28.201 "raid": { 00:16:28.201 "uuid": "12f406cc-44bc-4b3c-8155-2ab795d610fc", 00:16:28.202 "strip_size_kb": 0, 00:16:28.202 "state": "online", 00:16:28.202 "raid_level": "raid1", 00:16:28.202 "superblock": false, 00:16:28.202 "num_base_bdevs": 3, 00:16:28.202 "num_base_bdevs_discovered": 3, 00:16:28.202 "num_base_bdevs_operational": 3, 00:16:28.202 "base_bdevs_list": [ 00:16:28.202 { 00:16:28.202 "name": "NewBaseBdev", 00:16:28.202 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:28.202 "is_configured": true, 00:16:28.202 "data_offset": 0, 00:16:28.202 "data_size": 65536 00:16:28.202 }, 00:16:28.202 { 00:16:28.202 "name": "BaseBdev2", 00:16:28.202 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:28.202 "is_configured": true, 00:16:28.202 "data_offset": 0, 00:16:28.202 "data_size": 65536 00:16:28.202 }, 00:16:28.202 { 00:16:28.202 "name": "BaseBdev3", 00:16:28.202 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:28.202 "is_configured": true, 00:16:28.202 "data_offset": 0, 00:16:28.202 "data_size": 
65536 00:16:28.202 } 00:16:28.202 ] 00:16:28.202 } 00:16:28.202 } 00:16:28.202 }' 00:16:28.202 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:28.202 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:28.202 BaseBdev2 00:16:28.202 BaseBdev3' 00:16:28.202 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:28.202 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:28.202 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:28.461 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:28.461 "name": "NewBaseBdev", 00:16:28.461 "aliases": [ 00:16:28.461 "0c94ef58-71dc-46fe-823a-d23d17642c6c" 00:16:28.461 ], 00:16:28.461 "product_name": "Malloc disk", 00:16:28.461 "block_size": 512, 00:16:28.461 "num_blocks": 65536, 00:16:28.461 "uuid": "0c94ef58-71dc-46fe-823a-d23d17642c6c", 00:16:28.461 "assigned_rate_limits": { 00:16:28.461 "rw_ios_per_sec": 0, 00:16:28.461 "rw_mbytes_per_sec": 0, 00:16:28.461 "r_mbytes_per_sec": 0, 00:16:28.461 "w_mbytes_per_sec": 0 00:16:28.461 }, 00:16:28.461 "claimed": true, 00:16:28.461 "claim_type": "exclusive_write", 00:16:28.461 "zoned": false, 00:16:28.461 "supported_io_types": { 00:16:28.461 "read": true, 00:16:28.461 "write": true, 00:16:28.461 "unmap": true, 00:16:28.461 "flush": true, 00:16:28.461 "reset": true, 00:16:28.461 "nvme_admin": false, 00:16:28.461 "nvme_io": false, 00:16:28.461 "nvme_io_md": false, 00:16:28.461 "write_zeroes": true, 00:16:28.461 "zcopy": true, 00:16:28.461 "get_zone_info": false, 00:16:28.461 "zone_management": false, 00:16:28.461 "zone_append": false, 00:16:28.461 "compare": false, 00:16:28.461 "compare_and_write": false, 00:16:28.461 "abort": true, 00:16:28.461 "seek_hole": false, 00:16:28.461 "seek_data": false, 00:16:28.461 "copy": true, 00:16:28.461 "nvme_iov_md": false 00:16:28.461 }, 00:16:28.461 "memory_domains": [ 00:16:28.461 { 00:16:28.461 "dma_device_id": "system", 00:16:28.461 "dma_device_type": 1 00:16:28.461 }, 00:16:28.461 { 00:16:28.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.461 "dma_device_type": 2 00:16:28.461 } 00:16:28.461 ], 00:16:28.461 "driver_specific": {} 00:16:28.461 }' 00:16:28.461 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:28.461 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:28.461 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:28.461 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:28.719 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:28.978 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:28.978 "name": "BaseBdev2", 00:16:28.978 "aliases": [ 00:16:28.978 "2ec15916-c76d-48ac-bffa-b96503913406" 00:16:28.978 ], 00:16:28.978 "product_name": "Malloc disk", 00:16:28.978 "block_size": 512, 00:16:28.978 "num_blocks": 65536, 00:16:28.978 "uuid": "2ec15916-c76d-48ac-bffa-b96503913406", 00:16:28.978 "assigned_rate_limits": { 00:16:28.978 "rw_ios_per_sec": 0, 00:16:28.978 "rw_mbytes_per_sec": 0, 00:16:28.978 "r_mbytes_per_sec": 0, 00:16:28.978 "w_mbytes_per_sec": 0 00:16:28.978 }, 00:16:28.978 "claimed": true, 00:16:28.978 "claim_type": "exclusive_write", 00:16:28.978 "zoned": false, 00:16:28.978 "supported_io_types": { 00:16:28.978 "read": true, 00:16:28.978 "write": true, 00:16:28.978 "unmap": true, 00:16:28.978 "flush": true, 00:16:28.978 "reset": true, 00:16:28.978 "nvme_admin": false, 00:16:28.978 "nvme_io": false, 00:16:28.978 "nvme_io_md": false, 00:16:28.978 "write_zeroes": true, 00:16:28.978 "zcopy": true, 00:16:28.978 "get_zone_info": false, 00:16:28.978 "zone_management": false, 00:16:28.978 "zone_append": false, 00:16:28.978 "compare": false, 00:16:28.978 "compare_and_write": false, 00:16:28.978 "abort": true, 00:16:28.978 "seek_hole": false, 00:16:28.978 "seek_data": false, 00:16:28.978 "copy": true, 00:16:28.978 "nvme_iov_md": false 00:16:28.978 }, 00:16:28.978 "memory_domains": [ 00:16:28.978 { 00:16:28.978 "dma_device_id": "system", 00:16:28.978 "dma_device_type": 1 00:16:28.978 }, 00:16:28.978 { 00:16:28.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.978 "dma_device_type": 2 00:16:28.978 } 00:16:28.978 ], 00:16:28.978 "driver_specific": {} 00:16:28.978 }' 00:16:28.978 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.237 09:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.237 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:29.237 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.237 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.237 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:29.237 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.237 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.237 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:29.237 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.237 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.495 09:20:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:29.495 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:29.495 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:29.495 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:29.753 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:29.753 "name": "BaseBdev3", 00:16:29.753 "aliases": [ 00:16:29.753 "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca" 00:16:29.753 ], 00:16:29.753 "product_name": "Malloc disk", 00:16:29.753 "block_size": 512, 00:16:29.753 "num_blocks": 65536, 00:16:29.753 "uuid": "7e2e57a7-ebf8-4fea-9dd0-d5d91cc7d6ca", 00:16:29.753 "assigned_rate_limits": { 00:16:29.753 "rw_ios_per_sec": 0, 00:16:29.753 "rw_mbytes_per_sec": 0, 00:16:29.753 "r_mbytes_per_sec": 0, 00:16:29.753 "w_mbytes_per_sec": 0 00:16:29.753 }, 00:16:29.753 "claimed": true, 00:16:29.753 "claim_type": "exclusive_write", 00:16:29.753 "zoned": false, 00:16:29.753 "supported_io_types": { 00:16:29.753 "read": true, 00:16:29.753 "write": true, 00:16:29.753 "unmap": true, 00:16:29.753 "flush": true, 00:16:29.753 "reset": true, 00:16:29.753 "nvme_admin": false, 00:16:29.753 "nvme_io": false, 00:16:29.753 "nvme_io_md": false, 00:16:29.753 "write_zeroes": true, 00:16:29.753 "zcopy": true, 00:16:29.753 "get_zone_info": false, 00:16:29.753 "zone_management": false, 00:16:29.753 "zone_append": false, 00:16:29.753 "compare": false, 00:16:29.753 "compare_and_write": false, 00:16:29.753 "abort": true, 00:16:29.753 "seek_hole": false, 00:16:29.753 "seek_data": false, 00:16:29.753 "copy": true, 00:16:29.753 "nvme_iov_md": false 00:16:29.753 }, 00:16:29.753 "memory_domains": [ 00:16:29.753 { 00:16:29.753 "dma_device_id": "system", 00:16:29.753 "dma_device_type": 1 00:16:29.753 }, 00:16:29.753 { 00:16:29.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.754 "dma_device_type": 2 00:16:29.754 } 00:16:29.754 ], 00:16:29.754 "driver_specific": {} 00:16:29.754 }' 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:29.754 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.012 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.012 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:30.012 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:30.012 [2024-07-15 09:20:38.963638] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:30.012 [2024-07-15 09:20:38.963670] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:30.012 [2024-07-15 09:20:38.963726] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:30.012 [2024-07-15 09:20:38.964002] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:30.012 [2024-07-15 09:20:38.964016] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11f9e40 name Existed_Raid, state offline 00:16:30.270 09:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 130956 00:16:30.270 09:20:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 130956 ']' 00:16:30.270 09:20:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 130956 00:16:30.270 09:20:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:30.271 09:20:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:30.271 09:20:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 130956 00:16:30.271 09:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:30.271 09:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:30.271 09:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 130956' 00:16:30.271 killing process with pid 130956 00:16:30.271 09:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 130956 00:16:30.271 [2024-07-15 09:20:39.027017] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:30.271 09:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 130956 00:16:30.271 [2024-07-15 09:20:39.057854] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:30.529 00:16:30.529 real 0m27.778s 00:16:30.529 user 0m50.899s 00:16:30.529 sys 0m5.051s 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:30.529 ************************************ 00:16:30.529 END TEST raid_state_function_test 00:16:30.529 ************************************ 00:16:30.529 09:20:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:30.529 09:20:39 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:16:30.529 09:20:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:30.529 09:20:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:30.529 09:20:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:30.529 ************************************ 00:16:30.529 START TEST raid_state_function_test_sb 00:16:30.529 ************************************ 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=135084 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 135084' 00:16:30.529 Process raid pid: 135084 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 135084 /var/tmp/spdk-raid.sock 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 135084 ']' 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:30.529 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:30.529 09:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:30.529 [2024-07-15 09:20:39.437906] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:16:30.529 [2024-07-15 09:20:39.437980] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:30.788 [2024-07-15 09:20:39.571220] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:30.788 [2024-07-15 09:20:39.682973] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:31.047 [2024-07-15 09:20:39.748869] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:31.047 [2024-07-15 09:20:39.748893] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:31.613 09:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:31.613 09:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:31.613 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:31.871 [2024-07-15 09:20:40.607595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:31.871 [2024-07-15 09:20:40.607637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:31.871 [2024-07-15 09:20:40.607648] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:31.871 [2024-07-15 09:20:40.607659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:31.871 [2024-07-15 09:20:40.607668] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:31.871 [2024-07-15 09:20:40.607679] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.871 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.129 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.130 "name": "Existed_Raid", 00:16:32.130 "uuid": "2a5bbb54-e2fe-473f-9f34-768b28dc6a51", 00:16:32.130 "strip_size_kb": 0, 00:16:32.130 "state": "configuring", 00:16:32.130 "raid_level": "raid1", 00:16:32.130 "superblock": true, 00:16:32.130 "num_base_bdevs": 3, 00:16:32.130 "num_base_bdevs_discovered": 0, 00:16:32.130 "num_base_bdevs_operational": 3, 00:16:32.130 "base_bdevs_list": [ 00:16:32.130 { 00:16:32.130 "name": "BaseBdev1", 00:16:32.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.130 "is_configured": false, 00:16:32.130 "data_offset": 0, 00:16:32.130 "data_size": 0 00:16:32.130 }, 00:16:32.130 { 00:16:32.130 "name": "BaseBdev2", 00:16:32.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.130 "is_configured": false, 00:16:32.130 "data_offset": 0, 00:16:32.130 "data_size": 0 00:16:32.130 }, 00:16:32.130 { 00:16:32.130 "name": "BaseBdev3", 00:16:32.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.130 "is_configured": false, 00:16:32.130 "data_offset": 0, 00:16:32.130 "data_size": 0 00:16:32.130 } 00:16:32.130 ] 00:16:32.130 }' 00:16:32.130 09:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.130 09:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:32.695 09:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:32.953 [2024-07-15 09:20:41.690309] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:32.953 [2024-07-15 09:20:41.690337] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fe5a80 name Existed_Raid, state configuring 00:16:32.953 09:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:33.211 [2024-07-15 09:20:41.934977] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:33.211 [2024-07-15 09:20:41.935002] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:33.211 [2024-07-15 09:20:41.935012] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:33.211 [2024-07-15 09:20:41.935023] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:16:33.212 [2024-07-15 09:20:41.935032] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:33.212 [2024-07-15 09:20:41.935042] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:33.212 09:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:33.470 [2024-07-15 09:20:42.189552] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:33.470 BaseBdev1 00:16:33.470 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:33.470 09:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:33.470 09:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:33.470 09:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:33.470 09:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:33.470 09:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:33.470 09:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:33.728 09:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:33.986 [ 00:16:33.986 { 00:16:33.986 "name": "BaseBdev1", 00:16:33.986 "aliases": [ 00:16:33.986 "8eef8b7f-271c-4d26-986d-5f0910d796b8" 00:16:33.986 ], 00:16:33.986 "product_name": "Malloc disk", 00:16:33.986 "block_size": 512, 00:16:33.986 "num_blocks": 65536, 00:16:33.986 "uuid": "8eef8b7f-271c-4d26-986d-5f0910d796b8", 00:16:33.986 "assigned_rate_limits": { 00:16:33.986 "rw_ios_per_sec": 0, 00:16:33.986 "rw_mbytes_per_sec": 0, 00:16:33.986 "r_mbytes_per_sec": 0, 00:16:33.986 "w_mbytes_per_sec": 0 00:16:33.986 }, 00:16:33.986 "claimed": true, 00:16:33.986 "claim_type": "exclusive_write", 00:16:33.986 "zoned": false, 00:16:33.986 "supported_io_types": { 00:16:33.986 "read": true, 00:16:33.986 "write": true, 00:16:33.986 "unmap": true, 00:16:33.986 "flush": true, 00:16:33.986 "reset": true, 00:16:33.986 "nvme_admin": false, 00:16:33.986 "nvme_io": false, 00:16:33.986 "nvme_io_md": false, 00:16:33.986 "write_zeroes": true, 00:16:33.986 "zcopy": true, 00:16:33.986 "get_zone_info": false, 00:16:33.986 "zone_management": false, 00:16:33.986 "zone_append": false, 00:16:33.986 "compare": false, 00:16:33.986 "compare_and_write": false, 00:16:33.986 "abort": true, 00:16:33.986 "seek_hole": false, 00:16:33.986 "seek_data": false, 00:16:33.986 "copy": true, 00:16:33.986 "nvme_iov_md": false 00:16:33.986 }, 00:16:33.986 "memory_domains": [ 00:16:33.986 { 00:16:33.986 "dma_device_id": "system", 00:16:33.986 "dma_device_type": 1 00:16:33.986 }, 00:16:33.986 { 00:16:33.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.986 "dma_device_type": 2 00:16:33.986 } 00:16:33.986 ], 00:16:33.986 "driver_specific": {} 00:16:33.986 } 00:16:33.986 ] 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 
00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.986 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:34.245 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.245 "name": "Existed_Raid", 00:16:34.245 "uuid": "ccd97795-9305-45c8-b8f2-3c30772e7bbb", 00:16:34.245 "strip_size_kb": 0, 00:16:34.245 "state": "configuring", 00:16:34.245 "raid_level": "raid1", 00:16:34.245 "superblock": true, 00:16:34.245 "num_base_bdevs": 3, 00:16:34.245 "num_base_bdevs_discovered": 1, 00:16:34.245 "num_base_bdevs_operational": 3, 00:16:34.245 "base_bdevs_list": [ 00:16:34.245 { 00:16:34.245 "name": "BaseBdev1", 00:16:34.245 "uuid": "8eef8b7f-271c-4d26-986d-5f0910d796b8", 00:16:34.245 "is_configured": true, 00:16:34.245 "data_offset": 2048, 00:16:34.245 "data_size": 63488 00:16:34.245 }, 00:16:34.245 { 00:16:34.245 "name": "BaseBdev2", 00:16:34.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.245 "is_configured": false, 00:16:34.245 "data_offset": 0, 00:16:34.245 "data_size": 0 00:16:34.245 }, 00:16:34.245 { 00:16:34.245 "name": "BaseBdev3", 00:16:34.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.245 "is_configured": false, 00:16:34.245 "data_offset": 0, 00:16:34.245 "data_size": 0 00:16:34.245 } 00:16:34.245 ] 00:16:34.245 }' 00:16:34.245 09:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.245 09:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:34.863 09:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:34.863 [2024-07-15 09:20:43.765785] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:34.863 [2024-07-15 09:20:43.765828] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fe5310 name Existed_Raid, state configuring 00:16:34.863 09:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:35.121 [2024-07-15 09:20:44.014475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:35.121 [2024-07-15 09:20:44.015970] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:35.121 [2024-07-15 09:20:44.016002] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:35.121 [2024-07-15 09:20:44.016012] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:35.121 [2024-07-15 09:20:44.016024] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.121 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.378 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.378 "name": "Existed_Raid", 00:16:35.378 "uuid": "331b4b18-0310-4960-b700-1e4267849d95", 00:16:35.378 "strip_size_kb": 0, 00:16:35.378 "state": "configuring", 00:16:35.378 "raid_level": "raid1", 00:16:35.378 "superblock": true, 00:16:35.378 "num_base_bdevs": 3, 00:16:35.378 "num_base_bdevs_discovered": 1, 00:16:35.378 "num_base_bdevs_operational": 3, 00:16:35.378 "base_bdevs_list": [ 00:16:35.378 { 00:16:35.378 "name": "BaseBdev1", 00:16:35.378 "uuid": "8eef8b7f-271c-4d26-986d-5f0910d796b8", 00:16:35.378 "is_configured": true, 00:16:35.378 "data_offset": 2048, 00:16:35.378 "data_size": 63488 00:16:35.379 }, 00:16:35.379 { 00:16:35.379 "name": "BaseBdev2", 00:16:35.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.379 "is_configured": false, 00:16:35.379 "data_offset": 0, 00:16:35.379 "data_size": 0 00:16:35.379 }, 00:16:35.379 { 00:16:35.379 "name": 
"BaseBdev3", 00:16:35.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.379 "is_configured": false, 00:16:35.379 "data_offset": 0, 00:16:35.379 "data_size": 0 00:16:35.379 } 00:16:35.379 ] 00:16:35.379 }' 00:16:35.379 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.379 09:20:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:35.944 09:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:36.201 [2024-07-15 09:20:45.108797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:36.201 BaseBdev2 00:16:36.201 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:36.201 09:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:36.201 09:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:36.201 09:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:36.201 09:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:36.201 09:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:36.201 09:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:36.459 09:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:36.716 [ 00:16:36.716 { 00:16:36.716 "name": "BaseBdev2", 00:16:36.716 "aliases": [ 00:16:36.716 "da7f9241-1920-41ba-8f52-cf2b0f8ccef3" 00:16:36.716 ], 00:16:36.716 "product_name": "Malloc disk", 00:16:36.716 "block_size": 512, 00:16:36.716 "num_blocks": 65536, 00:16:36.716 "uuid": "da7f9241-1920-41ba-8f52-cf2b0f8ccef3", 00:16:36.716 "assigned_rate_limits": { 00:16:36.716 "rw_ios_per_sec": 0, 00:16:36.716 "rw_mbytes_per_sec": 0, 00:16:36.716 "r_mbytes_per_sec": 0, 00:16:36.716 "w_mbytes_per_sec": 0 00:16:36.716 }, 00:16:36.716 "claimed": true, 00:16:36.716 "claim_type": "exclusive_write", 00:16:36.716 "zoned": false, 00:16:36.716 "supported_io_types": { 00:16:36.716 "read": true, 00:16:36.716 "write": true, 00:16:36.716 "unmap": true, 00:16:36.716 "flush": true, 00:16:36.716 "reset": true, 00:16:36.716 "nvme_admin": false, 00:16:36.716 "nvme_io": false, 00:16:36.716 "nvme_io_md": false, 00:16:36.716 "write_zeroes": true, 00:16:36.716 "zcopy": true, 00:16:36.716 "get_zone_info": false, 00:16:36.716 "zone_management": false, 00:16:36.716 "zone_append": false, 00:16:36.716 "compare": false, 00:16:36.716 "compare_and_write": false, 00:16:36.716 "abort": true, 00:16:36.716 "seek_hole": false, 00:16:36.716 "seek_data": false, 00:16:36.716 "copy": true, 00:16:36.716 "nvme_iov_md": false 00:16:36.716 }, 00:16:36.716 "memory_domains": [ 00:16:36.716 { 00:16:36.716 "dma_device_id": "system", 00:16:36.716 "dma_device_type": 1 00:16:36.716 }, 00:16:36.717 { 00:16:36.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.717 "dma_device_type": 2 00:16:36.717 } 00:16:36.717 ], 00:16:36.717 "driver_specific": {} 
00:16:36.717 } 00:16:36.717 ] 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.717 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.974 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.974 "name": "Existed_Raid", 00:16:36.974 "uuid": "331b4b18-0310-4960-b700-1e4267849d95", 00:16:36.974 "strip_size_kb": 0, 00:16:36.974 "state": "configuring", 00:16:36.974 "raid_level": "raid1", 00:16:36.974 "superblock": true, 00:16:36.974 "num_base_bdevs": 3, 00:16:36.974 "num_base_bdevs_discovered": 2, 00:16:36.974 "num_base_bdevs_operational": 3, 00:16:36.974 "base_bdevs_list": [ 00:16:36.974 { 00:16:36.974 "name": "BaseBdev1", 00:16:36.974 "uuid": "8eef8b7f-271c-4d26-986d-5f0910d796b8", 00:16:36.974 "is_configured": true, 00:16:36.974 "data_offset": 2048, 00:16:36.974 "data_size": 63488 00:16:36.974 }, 00:16:36.974 { 00:16:36.974 "name": "BaseBdev2", 00:16:36.974 "uuid": "da7f9241-1920-41ba-8f52-cf2b0f8ccef3", 00:16:36.974 "is_configured": true, 00:16:36.974 "data_offset": 2048, 00:16:36.974 "data_size": 63488 00:16:36.974 }, 00:16:36.974 { 00:16:36.974 "name": "BaseBdev3", 00:16:36.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.974 "is_configured": false, 00:16:36.974 "data_offset": 0, 00:16:36.974 "data_size": 0 00:16:36.974 } 00:16:36.974 ] 00:16:36.974 }' 00:16:36.974 09:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.974 09:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:37.537 09:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:37.795 [2024-07-15 
09:20:46.616206] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:37.795 [2024-07-15 09:20:46.616365] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fe6400 00:16:37.795 [2024-07-15 09:20:46.616379] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:37.795 [2024-07-15 09:20:46.616552] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fe5ef0 00:16:37.795 [2024-07-15 09:20:46.616672] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fe6400 00:16:37.795 [2024-07-15 09:20:46.616682] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fe6400 00:16:37.795 [2024-07-15 09:20:46.616771] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:37.795 BaseBdev3 00:16:37.795 09:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:37.795 09:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:37.795 09:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:37.795 09:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:37.795 09:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:37.795 09:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:37.795 09:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.052 09:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:38.309 [ 00:16:38.309 { 00:16:38.309 "name": "BaseBdev3", 00:16:38.309 "aliases": [ 00:16:38.309 "839d68d1-ee70-410e-903a-00540ce17c1c" 00:16:38.309 ], 00:16:38.309 "product_name": "Malloc disk", 00:16:38.309 "block_size": 512, 00:16:38.309 "num_blocks": 65536, 00:16:38.309 "uuid": "839d68d1-ee70-410e-903a-00540ce17c1c", 00:16:38.309 "assigned_rate_limits": { 00:16:38.309 "rw_ios_per_sec": 0, 00:16:38.309 "rw_mbytes_per_sec": 0, 00:16:38.309 "r_mbytes_per_sec": 0, 00:16:38.309 "w_mbytes_per_sec": 0 00:16:38.309 }, 00:16:38.309 "claimed": true, 00:16:38.309 "claim_type": "exclusive_write", 00:16:38.309 "zoned": false, 00:16:38.309 "supported_io_types": { 00:16:38.309 "read": true, 00:16:38.309 "write": true, 00:16:38.309 "unmap": true, 00:16:38.309 "flush": true, 00:16:38.309 "reset": true, 00:16:38.309 "nvme_admin": false, 00:16:38.309 "nvme_io": false, 00:16:38.309 "nvme_io_md": false, 00:16:38.309 "write_zeroes": true, 00:16:38.309 "zcopy": true, 00:16:38.309 "get_zone_info": false, 00:16:38.309 "zone_management": false, 00:16:38.309 "zone_append": false, 00:16:38.309 "compare": false, 00:16:38.309 "compare_and_write": false, 00:16:38.309 "abort": true, 00:16:38.309 "seek_hole": false, 00:16:38.309 "seek_data": false, 00:16:38.309 "copy": true, 00:16:38.309 "nvme_iov_md": false 00:16:38.309 }, 00:16:38.309 "memory_domains": [ 00:16:38.309 { 00:16:38.309 "dma_device_id": "system", 00:16:38.309 "dma_device_type": 1 00:16:38.309 }, 00:16:38.309 { 00:16:38.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.309 
"dma_device_type": 2 00:16:38.309 } 00:16:38.309 ], 00:16:38.309 "driver_specific": {} 00:16:38.309 } 00:16:38.309 ] 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.309 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:38.566 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:38.566 "name": "Existed_Raid", 00:16:38.566 "uuid": "331b4b18-0310-4960-b700-1e4267849d95", 00:16:38.566 "strip_size_kb": 0, 00:16:38.566 "state": "online", 00:16:38.566 "raid_level": "raid1", 00:16:38.566 "superblock": true, 00:16:38.566 "num_base_bdevs": 3, 00:16:38.566 "num_base_bdevs_discovered": 3, 00:16:38.566 "num_base_bdevs_operational": 3, 00:16:38.566 "base_bdevs_list": [ 00:16:38.566 { 00:16:38.566 "name": "BaseBdev1", 00:16:38.566 "uuid": "8eef8b7f-271c-4d26-986d-5f0910d796b8", 00:16:38.566 "is_configured": true, 00:16:38.566 "data_offset": 2048, 00:16:38.566 "data_size": 63488 00:16:38.566 }, 00:16:38.566 { 00:16:38.566 "name": "BaseBdev2", 00:16:38.566 "uuid": "da7f9241-1920-41ba-8f52-cf2b0f8ccef3", 00:16:38.566 "is_configured": true, 00:16:38.566 "data_offset": 2048, 00:16:38.566 "data_size": 63488 00:16:38.566 }, 00:16:38.566 { 00:16:38.566 "name": "BaseBdev3", 00:16:38.566 "uuid": "839d68d1-ee70-410e-903a-00540ce17c1c", 00:16:38.566 "is_configured": true, 00:16:38.566 "data_offset": 2048, 00:16:38.566 "data_size": 63488 00:16:38.566 } 00:16:38.566 ] 00:16:38.566 }' 00:16:38.566 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:38.566 09:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:39.129 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:39.129 09:20:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:39.129 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:39.129 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:39.129 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:39.129 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:39.129 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:39.129 09:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:39.387 [2024-07-15 09:20:48.176663] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:39.387 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:39.387 "name": "Existed_Raid", 00:16:39.387 "aliases": [ 00:16:39.387 "331b4b18-0310-4960-b700-1e4267849d95" 00:16:39.387 ], 00:16:39.387 "product_name": "Raid Volume", 00:16:39.387 "block_size": 512, 00:16:39.387 "num_blocks": 63488, 00:16:39.387 "uuid": "331b4b18-0310-4960-b700-1e4267849d95", 00:16:39.387 "assigned_rate_limits": { 00:16:39.387 "rw_ios_per_sec": 0, 00:16:39.387 "rw_mbytes_per_sec": 0, 00:16:39.387 "r_mbytes_per_sec": 0, 00:16:39.387 "w_mbytes_per_sec": 0 00:16:39.387 }, 00:16:39.387 "claimed": false, 00:16:39.387 "zoned": false, 00:16:39.387 "supported_io_types": { 00:16:39.387 "read": true, 00:16:39.387 "write": true, 00:16:39.387 "unmap": false, 00:16:39.387 "flush": false, 00:16:39.387 "reset": true, 00:16:39.387 "nvme_admin": false, 00:16:39.387 "nvme_io": false, 00:16:39.387 "nvme_io_md": false, 00:16:39.387 "write_zeroes": true, 00:16:39.387 "zcopy": false, 00:16:39.387 "get_zone_info": false, 00:16:39.387 "zone_management": false, 00:16:39.387 "zone_append": false, 00:16:39.387 "compare": false, 00:16:39.387 "compare_and_write": false, 00:16:39.387 "abort": false, 00:16:39.387 "seek_hole": false, 00:16:39.387 "seek_data": false, 00:16:39.387 "copy": false, 00:16:39.387 "nvme_iov_md": false 00:16:39.387 }, 00:16:39.387 "memory_domains": [ 00:16:39.387 { 00:16:39.387 "dma_device_id": "system", 00:16:39.387 "dma_device_type": 1 00:16:39.387 }, 00:16:39.387 { 00:16:39.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.387 "dma_device_type": 2 00:16:39.387 }, 00:16:39.387 { 00:16:39.387 "dma_device_id": "system", 00:16:39.387 "dma_device_type": 1 00:16:39.387 }, 00:16:39.387 { 00:16:39.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.387 "dma_device_type": 2 00:16:39.387 }, 00:16:39.387 { 00:16:39.387 "dma_device_id": "system", 00:16:39.387 "dma_device_type": 1 00:16:39.387 }, 00:16:39.387 { 00:16:39.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.387 "dma_device_type": 2 00:16:39.387 } 00:16:39.387 ], 00:16:39.387 "driver_specific": { 00:16:39.387 "raid": { 00:16:39.387 "uuid": "331b4b18-0310-4960-b700-1e4267849d95", 00:16:39.387 "strip_size_kb": 0, 00:16:39.387 "state": "online", 00:16:39.387 "raid_level": "raid1", 00:16:39.387 "superblock": true, 00:16:39.387 "num_base_bdevs": 3, 00:16:39.387 "num_base_bdevs_discovered": 3, 00:16:39.387 "num_base_bdevs_operational": 3, 00:16:39.387 "base_bdevs_list": [ 00:16:39.387 { 00:16:39.387 "name": "BaseBdev1", 00:16:39.387 "uuid": 
"8eef8b7f-271c-4d26-986d-5f0910d796b8", 00:16:39.387 "is_configured": true, 00:16:39.387 "data_offset": 2048, 00:16:39.387 "data_size": 63488 00:16:39.387 }, 00:16:39.387 { 00:16:39.387 "name": "BaseBdev2", 00:16:39.387 "uuid": "da7f9241-1920-41ba-8f52-cf2b0f8ccef3", 00:16:39.387 "is_configured": true, 00:16:39.387 "data_offset": 2048, 00:16:39.387 "data_size": 63488 00:16:39.387 }, 00:16:39.387 { 00:16:39.387 "name": "BaseBdev3", 00:16:39.387 "uuid": "839d68d1-ee70-410e-903a-00540ce17c1c", 00:16:39.387 "is_configured": true, 00:16:39.387 "data_offset": 2048, 00:16:39.387 "data_size": 63488 00:16:39.387 } 00:16:39.387 ] 00:16:39.387 } 00:16:39.387 } 00:16:39.387 }' 00:16:39.387 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:39.387 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:39.387 BaseBdev2 00:16:39.387 BaseBdev3' 00:16:39.387 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:39.388 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:39.388 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:39.646 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:39.646 "name": "BaseBdev1", 00:16:39.646 "aliases": [ 00:16:39.646 "8eef8b7f-271c-4d26-986d-5f0910d796b8" 00:16:39.646 ], 00:16:39.646 "product_name": "Malloc disk", 00:16:39.646 "block_size": 512, 00:16:39.646 "num_blocks": 65536, 00:16:39.646 "uuid": "8eef8b7f-271c-4d26-986d-5f0910d796b8", 00:16:39.646 "assigned_rate_limits": { 00:16:39.646 "rw_ios_per_sec": 0, 00:16:39.646 "rw_mbytes_per_sec": 0, 00:16:39.646 "r_mbytes_per_sec": 0, 00:16:39.646 "w_mbytes_per_sec": 0 00:16:39.646 }, 00:16:39.646 "claimed": true, 00:16:39.646 "claim_type": "exclusive_write", 00:16:39.646 "zoned": false, 00:16:39.646 "supported_io_types": { 00:16:39.646 "read": true, 00:16:39.646 "write": true, 00:16:39.646 "unmap": true, 00:16:39.646 "flush": true, 00:16:39.646 "reset": true, 00:16:39.646 "nvme_admin": false, 00:16:39.646 "nvme_io": false, 00:16:39.646 "nvme_io_md": false, 00:16:39.646 "write_zeroes": true, 00:16:39.646 "zcopy": true, 00:16:39.646 "get_zone_info": false, 00:16:39.646 "zone_management": false, 00:16:39.646 "zone_append": false, 00:16:39.646 "compare": false, 00:16:39.646 "compare_and_write": false, 00:16:39.646 "abort": true, 00:16:39.646 "seek_hole": false, 00:16:39.646 "seek_data": false, 00:16:39.646 "copy": true, 00:16:39.646 "nvme_iov_md": false 00:16:39.646 }, 00:16:39.646 "memory_domains": [ 00:16:39.646 { 00:16:39.646 "dma_device_id": "system", 00:16:39.646 "dma_device_type": 1 00:16:39.646 }, 00:16:39.646 { 00:16:39.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.646 "dma_device_type": 2 00:16:39.646 } 00:16:39.646 ], 00:16:39.646 "driver_specific": {} 00:16:39.646 }' 00:16:39.646 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.646 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.904 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:39.904 09:20:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.904 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.904 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:39.904 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.904 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.904 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.904 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.904 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:40.161 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:40.161 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:40.161 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:40.161 09:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:40.419 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:40.419 "name": "BaseBdev2", 00:16:40.419 "aliases": [ 00:16:40.419 "da7f9241-1920-41ba-8f52-cf2b0f8ccef3" 00:16:40.419 ], 00:16:40.419 "product_name": "Malloc disk", 00:16:40.420 "block_size": 512, 00:16:40.420 "num_blocks": 65536, 00:16:40.420 "uuid": "da7f9241-1920-41ba-8f52-cf2b0f8ccef3", 00:16:40.420 "assigned_rate_limits": { 00:16:40.420 "rw_ios_per_sec": 0, 00:16:40.420 "rw_mbytes_per_sec": 0, 00:16:40.420 "r_mbytes_per_sec": 0, 00:16:40.420 "w_mbytes_per_sec": 0 00:16:40.420 }, 00:16:40.420 "claimed": true, 00:16:40.420 "claim_type": "exclusive_write", 00:16:40.420 "zoned": false, 00:16:40.420 "supported_io_types": { 00:16:40.420 "read": true, 00:16:40.420 "write": true, 00:16:40.420 "unmap": true, 00:16:40.420 "flush": true, 00:16:40.420 "reset": true, 00:16:40.420 "nvme_admin": false, 00:16:40.420 "nvme_io": false, 00:16:40.420 "nvme_io_md": false, 00:16:40.420 "write_zeroes": true, 00:16:40.420 "zcopy": true, 00:16:40.420 "get_zone_info": false, 00:16:40.420 "zone_management": false, 00:16:40.420 "zone_append": false, 00:16:40.420 "compare": false, 00:16:40.420 "compare_and_write": false, 00:16:40.420 "abort": true, 00:16:40.420 "seek_hole": false, 00:16:40.420 "seek_data": false, 00:16:40.420 "copy": true, 00:16:40.420 "nvme_iov_md": false 00:16:40.420 }, 00:16:40.420 "memory_domains": [ 00:16:40.420 { 00:16:40.420 "dma_device_id": "system", 00:16:40.420 "dma_device_type": 1 00:16:40.420 }, 00:16:40.420 { 00:16:40.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.420 "dma_device_type": 2 00:16:40.420 } 00:16:40.420 ], 00:16:40.420 "driver_specific": {} 00:16:40.420 }' 00:16:40.420 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:40.420 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:40.420 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:40.420 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:40.420 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:16:40.420 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:40.420 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:40.420 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:40.678 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:40.678 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:40.678 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:40.678 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:40.678 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:40.678 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:40.678 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:40.936 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:40.936 "name": "BaseBdev3", 00:16:40.936 "aliases": [ 00:16:40.936 "839d68d1-ee70-410e-903a-00540ce17c1c" 00:16:40.936 ], 00:16:40.936 "product_name": "Malloc disk", 00:16:40.936 "block_size": 512, 00:16:40.936 "num_blocks": 65536, 00:16:40.936 "uuid": "839d68d1-ee70-410e-903a-00540ce17c1c", 00:16:40.936 "assigned_rate_limits": { 00:16:40.936 "rw_ios_per_sec": 0, 00:16:40.936 "rw_mbytes_per_sec": 0, 00:16:40.936 "r_mbytes_per_sec": 0, 00:16:40.936 "w_mbytes_per_sec": 0 00:16:40.936 }, 00:16:40.936 "claimed": true, 00:16:40.936 "claim_type": "exclusive_write", 00:16:40.936 "zoned": false, 00:16:40.936 "supported_io_types": { 00:16:40.936 "read": true, 00:16:40.936 "write": true, 00:16:40.936 "unmap": true, 00:16:40.936 "flush": true, 00:16:40.936 "reset": true, 00:16:40.936 "nvme_admin": false, 00:16:40.936 "nvme_io": false, 00:16:40.936 "nvme_io_md": false, 00:16:40.936 "write_zeroes": true, 00:16:40.936 "zcopy": true, 00:16:40.936 "get_zone_info": false, 00:16:40.936 "zone_management": false, 00:16:40.936 "zone_append": false, 00:16:40.936 "compare": false, 00:16:40.936 "compare_and_write": false, 00:16:40.936 "abort": true, 00:16:40.936 "seek_hole": false, 00:16:40.936 "seek_data": false, 00:16:40.936 "copy": true, 00:16:40.936 "nvme_iov_md": false 00:16:40.936 }, 00:16:40.936 "memory_domains": [ 00:16:40.936 { 00:16:40.936 "dma_device_id": "system", 00:16:40.936 "dma_device_type": 1 00:16:40.936 }, 00:16:40.936 { 00:16:40.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.936 "dma_device_type": 2 00:16:40.936 } 00:16:40.936 ], 00:16:40.936 "driver_specific": {} 00:16:40.936 }' 00:16:40.936 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:40.936 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:40.936 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:40.936 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:40.936 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:41.194 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:41.194 
09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:41.194 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:41.194 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:41.194 09:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:41.194 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:41.194 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:41.194 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:41.452 [2024-07-15 09:20:50.334173] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:41.452 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:41.452 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:41.452 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.453 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.711 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.711 "name": "Existed_Raid", 00:16:41.711 "uuid": "331b4b18-0310-4960-b700-1e4267849d95", 00:16:41.711 "strip_size_kb": 0, 00:16:41.711 "state": "online", 00:16:41.711 "raid_level": "raid1", 00:16:41.711 "superblock": true, 00:16:41.711 "num_base_bdevs": 3, 00:16:41.711 "num_base_bdevs_discovered": 2, 00:16:41.711 "num_base_bdevs_operational": 2, 00:16:41.711 "base_bdevs_list": [ 00:16:41.711 { 00:16:41.711 "name": null, 00:16:41.711 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:16:41.711 "is_configured": false, 00:16:41.711 "data_offset": 2048, 00:16:41.711 "data_size": 63488 00:16:41.711 }, 00:16:41.711 { 00:16:41.711 "name": "BaseBdev2", 00:16:41.711 "uuid": "da7f9241-1920-41ba-8f52-cf2b0f8ccef3", 00:16:41.711 "is_configured": true, 00:16:41.711 "data_offset": 2048, 00:16:41.711 "data_size": 63488 00:16:41.711 }, 00:16:41.711 { 00:16:41.711 "name": "BaseBdev3", 00:16:41.711 "uuid": "839d68d1-ee70-410e-903a-00540ce17c1c", 00:16:41.711 "is_configured": true, 00:16:41.711 "data_offset": 2048, 00:16:41.711 "data_size": 63488 00:16:41.711 } 00:16:41.711 ] 00:16:41.711 }' 00:16:41.711 09:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.711 09:20:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:42.277 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:42.277 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:42.277 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.277 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:42.535 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:42.535 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:42.535 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:43.116 [2024-07-15 09:20:51.915443] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:43.116 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:43.116 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:43.116 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.116 09:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:43.374 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:43.374 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:43.374 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:43.941 [2024-07-15 09:20:52.690059] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:43.941 [2024-07-15 09:20:52.690148] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:43.941 [2024-07-15 09:20:52.701498] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:43.941 [2024-07-15 09:20:52.701531] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:43.941 [2024-07-15 09:20:52.701543] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fe6400 
name Existed_Raid, state offline 00:16:43.941 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:43.941 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:43.941 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.941 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:44.199 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:44.199 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:44.199 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:44.199 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:44.199 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:44.199 09:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:44.469 BaseBdev2 00:16:44.469 09:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:44.469 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:44.469 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:44.469 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:44.469 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:44.469 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:44.469 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:44.731 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:44.989 [ 00:16:44.989 { 00:16:44.989 "name": "BaseBdev2", 00:16:44.989 "aliases": [ 00:16:44.989 "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2" 00:16:44.989 ], 00:16:44.990 "product_name": "Malloc disk", 00:16:44.990 "block_size": 512, 00:16:44.990 "num_blocks": 65536, 00:16:44.990 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:44.990 "assigned_rate_limits": { 00:16:44.990 "rw_ios_per_sec": 0, 00:16:44.990 "rw_mbytes_per_sec": 0, 00:16:44.990 "r_mbytes_per_sec": 0, 00:16:44.990 "w_mbytes_per_sec": 0 00:16:44.990 }, 00:16:44.990 "claimed": false, 00:16:44.990 "zoned": false, 00:16:44.990 "supported_io_types": { 00:16:44.990 "read": true, 00:16:44.990 "write": true, 00:16:44.990 "unmap": true, 00:16:44.990 "flush": true, 00:16:44.990 "reset": true, 00:16:44.990 "nvme_admin": false, 00:16:44.990 "nvme_io": false, 00:16:44.990 "nvme_io_md": false, 00:16:44.990 "write_zeroes": true, 00:16:44.990 "zcopy": true, 00:16:44.990 "get_zone_info": false, 00:16:44.990 "zone_management": false, 00:16:44.990 "zone_append": false, 00:16:44.990 "compare": false, 00:16:44.990 
"compare_and_write": false, 00:16:44.990 "abort": true, 00:16:44.990 "seek_hole": false, 00:16:44.990 "seek_data": false, 00:16:44.990 "copy": true, 00:16:44.990 "nvme_iov_md": false 00:16:44.990 }, 00:16:44.990 "memory_domains": [ 00:16:44.990 { 00:16:44.990 "dma_device_id": "system", 00:16:44.990 "dma_device_type": 1 00:16:44.990 }, 00:16:44.990 { 00:16:44.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.990 "dma_device_type": 2 00:16:44.990 } 00:16:44.990 ], 00:16:44.990 "driver_specific": {} 00:16:44.990 } 00:16:44.990 ] 00:16:44.990 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:44.990 09:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:44.990 09:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:44.990 09:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:44.990 BaseBdev3 00:16:45.248 09:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:45.248 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:45.248 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:45.248 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:45.248 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:45.248 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:45.248 09:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:45.248 09:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:45.507 [ 00:16:45.507 { 00:16:45.507 "name": "BaseBdev3", 00:16:45.507 "aliases": [ 00:16:45.507 "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4" 00:16:45.507 ], 00:16:45.507 "product_name": "Malloc disk", 00:16:45.507 "block_size": 512, 00:16:45.507 "num_blocks": 65536, 00:16:45.507 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:45.507 "assigned_rate_limits": { 00:16:45.507 "rw_ios_per_sec": 0, 00:16:45.507 "rw_mbytes_per_sec": 0, 00:16:45.507 "r_mbytes_per_sec": 0, 00:16:45.507 "w_mbytes_per_sec": 0 00:16:45.507 }, 00:16:45.507 "claimed": false, 00:16:45.507 "zoned": false, 00:16:45.507 "supported_io_types": { 00:16:45.507 "read": true, 00:16:45.507 "write": true, 00:16:45.507 "unmap": true, 00:16:45.507 "flush": true, 00:16:45.507 "reset": true, 00:16:45.507 "nvme_admin": false, 00:16:45.507 "nvme_io": false, 00:16:45.507 "nvme_io_md": false, 00:16:45.507 "write_zeroes": true, 00:16:45.507 "zcopy": true, 00:16:45.507 "get_zone_info": false, 00:16:45.507 "zone_management": false, 00:16:45.507 "zone_append": false, 00:16:45.507 "compare": false, 00:16:45.507 "compare_and_write": false, 00:16:45.507 "abort": true, 00:16:45.507 "seek_hole": false, 00:16:45.507 "seek_data": false, 00:16:45.507 "copy": true, 00:16:45.507 "nvme_iov_md": false 00:16:45.507 }, 00:16:45.507 "memory_domains": [ 00:16:45.507 { 
00:16:45.507 "dma_device_id": "system", 00:16:45.507 "dma_device_type": 1 00:16:45.507 }, 00:16:45.507 { 00:16:45.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.507 "dma_device_type": 2 00:16:45.507 } 00:16:45.507 ], 00:16:45.507 "driver_specific": {} 00:16:45.507 } 00:16:45.507 ] 00:16:45.507 09:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:45.507 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:45.507 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:45.507 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:45.766 [2024-07-15 09:20:54.651162] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:45.766 [2024-07-15 09:20:54.651202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:45.766 [2024-07-15 09:20:54.651222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:45.766 [2024-07-15 09:20:54.652582] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.766 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.024 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.024 "name": "Existed_Raid", 00:16:46.024 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:46.024 "strip_size_kb": 0, 00:16:46.024 "state": "configuring", 00:16:46.024 "raid_level": "raid1", 00:16:46.024 "superblock": true, 00:16:46.024 "num_base_bdevs": 3, 00:16:46.024 "num_base_bdevs_discovered": 2, 00:16:46.024 "num_base_bdevs_operational": 3, 00:16:46.024 "base_bdevs_list": [ 00:16:46.024 { 00:16:46.024 "name": "BaseBdev1", 00:16:46.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:46.024 "is_configured": 
false, 00:16:46.024 "data_offset": 0, 00:16:46.024 "data_size": 0 00:16:46.024 }, 00:16:46.024 { 00:16:46.024 "name": "BaseBdev2", 00:16:46.025 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:46.025 "is_configured": true, 00:16:46.025 "data_offset": 2048, 00:16:46.025 "data_size": 63488 00:16:46.025 }, 00:16:46.025 { 00:16:46.025 "name": "BaseBdev3", 00:16:46.025 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:46.025 "is_configured": true, 00:16:46.025 "data_offset": 2048, 00:16:46.025 "data_size": 63488 00:16:46.025 } 00:16:46.025 ] 00:16:46.025 }' 00:16:46.025 09:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.025 09:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:46.591 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:46.849 [2024-07-15 09:20:55.725968] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.849 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.107 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.107 "name": "Existed_Raid", 00:16:47.107 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:47.107 "strip_size_kb": 0, 00:16:47.107 "state": "configuring", 00:16:47.107 "raid_level": "raid1", 00:16:47.107 "superblock": true, 00:16:47.107 "num_base_bdevs": 3, 00:16:47.107 "num_base_bdevs_discovered": 1, 00:16:47.107 "num_base_bdevs_operational": 3, 00:16:47.107 "base_bdevs_list": [ 00:16:47.107 { 00:16:47.107 "name": "BaseBdev1", 00:16:47.107 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.107 "is_configured": false, 00:16:47.107 "data_offset": 0, 00:16:47.107 "data_size": 0 00:16:47.107 }, 00:16:47.107 { 00:16:47.107 "name": null, 00:16:47.107 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:47.107 "is_configured": false, 00:16:47.107 "data_offset": 2048, 00:16:47.107 "data_size": 
63488 00:16:47.107 }, 00:16:47.107 { 00:16:47.107 "name": "BaseBdev3", 00:16:47.107 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:47.107 "is_configured": true, 00:16:47.107 "data_offset": 2048, 00:16:47.107 "data_size": 63488 00:16:47.107 } 00:16:47.107 ] 00:16:47.107 }' 00:16:47.107 09:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.107 09:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.673 09:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.673 09:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:47.932 09:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:47.932 09:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:48.190 [2024-07-15 09:20:57.074276] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:48.190 BaseBdev1 00:16:48.190 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:48.190 09:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:48.190 09:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:48.190 09:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:48.190 09:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:48.190 09:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:48.190 09:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:48.448 09:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:48.754 [ 00:16:48.754 { 00:16:48.754 "name": "BaseBdev1", 00:16:48.754 "aliases": [ 00:16:48.754 "5a5eda46-d4c9-4298-838b-abd95978f506" 00:16:48.754 ], 00:16:48.754 "product_name": "Malloc disk", 00:16:48.754 "block_size": 512, 00:16:48.754 "num_blocks": 65536, 00:16:48.754 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:48.754 "assigned_rate_limits": { 00:16:48.754 "rw_ios_per_sec": 0, 00:16:48.754 "rw_mbytes_per_sec": 0, 00:16:48.754 "r_mbytes_per_sec": 0, 00:16:48.754 "w_mbytes_per_sec": 0 00:16:48.754 }, 00:16:48.754 "claimed": true, 00:16:48.754 "claim_type": "exclusive_write", 00:16:48.754 "zoned": false, 00:16:48.754 "supported_io_types": { 00:16:48.754 "read": true, 00:16:48.754 "write": true, 00:16:48.754 "unmap": true, 00:16:48.754 "flush": true, 00:16:48.754 "reset": true, 00:16:48.754 "nvme_admin": false, 00:16:48.754 "nvme_io": false, 00:16:48.754 "nvme_io_md": false, 00:16:48.754 "write_zeroes": true, 00:16:48.754 "zcopy": true, 00:16:48.754 "get_zone_info": false, 00:16:48.754 "zone_management": false, 00:16:48.754 "zone_append": false, 00:16:48.754 "compare": false, 00:16:48.754 
"compare_and_write": false, 00:16:48.754 "abort": true, 00:16:48.754 "seek_hole": false, 00:16:48.754 "seek_data": false, 00:16:48.754 "copy": true, 00:16:48.754 "nvme_iov_md": false 00:16:48.754 }, 00:16:48.754 "memory_domains": [ 00:16:48.754 { 00:16:48.754 "dma_device_id": "system", 00:16:48.754 "dma_device_type": 1 00:16:48.754 }, 00:16:48.754 { 00:16:48.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.754 "dma_device_type": 2 00:16:48.754 } 00:16:48.754 ], 00:16:48.754 "driver_specific": {} 00:16:48.754 } 00:16:48.754 ] 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.754 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.026 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.026 "name": "Existed_Raid", 00:16:49.026 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:49.026 "strip_size_kb": 0, 00:16:49.026 "state": "configuring", 00:16:49.026 "raid_level": "raid1", 00:16:49.026 "superblock": true, 00:16:49.026 "num_base_bdevs": 3, 00:16:49.026 "num_base_bdevs_discovered": 2, 00:16:49.026 "num_base_bdevs_operational": 3, 00:16:49.027 "base_bdevs_list": [ 00:16:49.027 { 00:16:49.027 "name": "BaseBdev1", 00:16:49.027 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:49.027 "is_configured": true, 00:16:49.027 "data_offset": 2048, 00:16:49.027 "data_size": 63488 00:16:49.027 }, 00:16:49.027 { 00:16:49.027 "name": null, 00:16:49.027 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:49.027 "is_configured": false, 00:16:49.027 "data_offset": 2048, 00:16:49.027 "data_size": 63488 00:16:49.027 }, 00:16:49.027 { 00:16:49.027 "name": "BaseBdev3", 00:16:49.027 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:49.027 "is_configured": true, 00:16:49.027 "data_offset": 2048, 00:16:49.027 "data_size": 63488 00:16:49.027 } 00:16:49.027 ] 00:16:49.027 }' 00:16:49.027 09:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.027 09:20:57 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:16:49.594 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:49.594 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.852 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:49.852 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:50.110 [2024-07-15 09:20:58.875109] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.110 09:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.368 09:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.368 "name": "Existed_Raid", 00:16:50.368 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:50.368 "strip_size_kb": 0, 00:16:50.368 "state": "configuring", 00:16:50.368 "raid_level": "raid1", 00:16:50.368 "superblock": true, 00:16:50.368 "num_base_bdevs": 3, 00:16:50.368 "num_base_bdevs_discovered": 1, 00:16:50.368 "num_base_bdevs_operational": 3, 00:16:50.368 "base_bdevs_list": [ 00:16:50.368 { 00:16:50.368 "name": "BaseBdev1", 00:16:50.368 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:50.368 "is_configured": true, 00:16:50.368 "data_offset": 2048, 00:16:50.368 "data_size": 63488 00:16:50.368 }, 00:16:50.368 { 00:16:50.368 "name": null, 00:16:50.368 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:50.368 "is_configured": false, 00:16:50.368 "data_offset": 2048, 00:16:50.368 "data_size": 63488 00:16:50.368 }, 00:16:50.368 { 00:16:50.368 "name": null, 00:16:50.368 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:50.368 "is_configured": false, 00:16:50.368 "data_offset": 2048, 00:16:50.368 "data_size": 63488 00:16:50.368 } 00:16:50.368 ] 00:16:50.368 }' 
00:16:50.368 09:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.368 09:20:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:50.934 09:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.934 09:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:51.193 09:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:51.193 09:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:51.451 [2024-07-15 09:21:00.206677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.451 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:51.710 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:51.710 "name": "Existed_Raid", 00:16:51.710 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:51.710 "strip_size_kb": 0, 00:16:51.710 "state": "configuring", 00:16:51.710 "raid_level": "raid1", 00:16:51.710 "superblock": true, 00:16:51.710 "num_base_bdevs": 3, 00:16:51.710 "num_base_bdevs_discovered": 2, 00:16:51.710 "num_base_bdevs_operational": 3, 00:16:51.710 "base_bdevs_list": [ 00:16:51.710 { 00:16:51.710 "name": "BaseBdev1", 00:16:51.710 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:51.710 "is_configured": true, 00:16:51.710 "data_offset": 2048, 00:16:51.710 "data_size": 63488 00:16:51.710 }, 00:16:51.710 { 00:16:51.710 "name": null, 00:16:51.710 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:51.710 "is_configured": false, 00:16:51.710 "data_offset": 2048, 00:16:51.710 "data_size": 63488 00:16:51.710 }, 00:16:51.710 { 00:16:51.710 "name": "BaseBdev3", 
00:16:51.710 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:51.710 "is_configured": true, 00:16:51.710 "data_offset": 2048, 00:16:51.710 "data_size": 63488 00:16:51.710 } 00:16:51.710 ] 00:16:51.710 }' 00:16:51.710 09:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:51.710 09:21:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:52.278 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:52.278 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.537 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:52.537 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:52.796 [2024-07-15 09:21:01.530197] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.796 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:53.055 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.055 "name": "Existed_Raid", 00:16:53.055 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:53.055 "strip_size_kb": 0, 00:16:53.055 "state": "configuring", 00:16:53.055 "raid_level": "raid1", 00:16:53.055 "superblock": true, 00:16:53.055 "num_base_bdevs": 3, 00:16:53.055 "num_base_bdevs_discovered": 1, 00:16:53.055 "num_base_bdevs_operational": 3, 00:16:53.055 "base_bdevs_list": [ 00:16:53.055 { 00:16:53.055 "name": null, 00:16:53.055 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:53.055 "is_configured": false, 00:16:53.055 "data_offset": 2048, 00:16:53.055 "data_size": 63488 00:16:53.055 }, 00:16:53.055 { 00:16:53.055 "name": null, 00:16:53.055 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:53.055 
"is_configured": false, 00:16:53.055 "data_offset": 2048, 00:16:53.055 "data_size": 63488 00:16:53.055 }, 00:16:53.055 { 00:16:53.055 "name": "BaseBdev3", 00:16:53.055 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:53.055 "is_configured": true, 00:16:53.055 "data_offset": 2048, 00:16:53.055 "data_size": 63488 00:16:53.055 } 00:16:53.055 ] 00:16:53.055 }' 00:16:53.055 09:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.055 09:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:53.622 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.622 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:53.881 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:53.881 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:54.140 [2024-07-15 09:21:02.877443] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:54.140 09:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.398 09:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:54.398 "name": "Existed_Raid", 00:16:54.398 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:54.398 "strip_size_kb": 0, 00:16:54.398 "state": "configuring", 00:16:54.398 "raid_level": "raid1", 00:16:54.398 "superblock": true, 00:16:54.398 "num_base_bdevs": 3, 00:16:54.398 "num_base_bdevs_discovered": 2, 00:16:54.398 "num_base_bdevs_operational": 3, 00:16:54.398 "base_bdevs_list": [ 00:16:54.398 { 00:16:54.398 "name": null, 00:16:54.398 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:54.398 "is_configured": false, 
00:16:54.398 "data_offset": 2048, 00:16:54.398 "data_size": 63488 00:16:54.398 }, 00:16:54.398 { 00:16:54.398 "name": "BaseBdev2", 00:16:54.398 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:54.398 "is_configured": true, 00:16:54.398 "data_offset": 2048, 00:16:54.398 "data_size": 63488 00:16:54.398 }, 00:16:54.398 { 00:16:54.398 "name": "BaseBdev3", 00:16:54.398 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:54.398 "is_configured": true, 00:16:54.398 "data_offset": 2048, 00:16:54.398 "data_size": 63488 00:16:54.398 } 00:16:54.398 ] 00:16:54.398 }' 00:16:54.398 09:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:54.398 09:21:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:54.963 09:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:54.963 09:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.222 09:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:55.222 09:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.222 09:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:55.480 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 5a5eda46-d4c9-4298-838b-abd95978f506 00:16:55.480 [2024-07-15 09:21:04.426129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:55.480 [2024-07-15 09:21:04.426280] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fdc1b0 00:16:55.480 [2024-07-15 09:21:04.426294] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:55.480 [2024-07-15 09:21:04.426465] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21984f0 00:16:55.480 [2024-07-15 09:21:04.426584] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fdc1b0 00:16:55.481 [2024-07-15 09:21:04.426594] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fdc1b0 00:16:55.481 [2024-07-15 09:21:04.426697] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:55.481 NewBaseBdev 00:16:55.739 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:55.739 09:21:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:55.739 09:21:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:55.739 09:21:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:55.739 09:21:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:55.739 09:21:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:55.739 09:21:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:55.739 09:21:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:55.999 [ 00:16:55.999 { 00:16:55.999 "name": "NewBaseBdev", 00:16:55.999 "aliases": [ 00:16:55.999 "5a5eda46-d4c9-4298-838b-abd95978f506" 00:16:55.999 ], 00:16:55.999 "product_name": "Malloc disk", 00:16:55.999 "block_size": 512, 00:16:55.999 "num_blocks": 65536, 00:16:55.999 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:55.999 "assigned_rate_limits": { 00:16:55.999 "rw_ios_per_sec": 0, 00:16:55.999 "rw_mbytes_per_sec": 0, 00:16:55.999 "r_mbytes_per_sec": 0, 00:16:55.999 "w_mbytes_per_sec": 0 00:16:55.999 }, 00:16:55.999 "claimed": true, 00:16:55.999 "claim_type": "exclusive_write", 00:16:55.999 "zoned": false, 00:16:55.999 "supported_io_types": { 00:16:55.999 "read": true, 00:16:55.999 "write": true, 00:16:55.999 "unmap": true, 00:16:55.999 "flush": true, 00:16:55.999 "reset": true, 00:16:55.999 "nvme_admin": false, 00:16:55.999 "nvme_io": false, 00:16:55.999 "nvme_io_md": false, 00:16:55.999 "write_zeroes": true, 00:16:55.999 "zcopy": true, 00:16:55.999 "get_zone_info": false, 00:16:55.999 "zone_management": false, 00:16:55.999 "zone_append": false, 00:16:55.999 "compare": false, 00:16:55.999 "compare_and_write": false, 00:16:55.999 "abort": true, 00:16:55.999 "seek_hole": false, 00:16:55.999 "seek_data": false, 00:16:55.999 "copy": true, 00:16:55.999 "nvme_iov_md": false 00:16:55.999 }, 00:16:55.999 "memory_domains": [ 00:16:55.999 { 00:16:55.999 "dma_device_id": "system", 00:16:55.999 "dma_device_type": 1 00:16:55.999 }, 00:16:55.999 { 00:16:55.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.999 "dma_device_type": 2 00:16:55.999 } 00:16:55.999 ], 00:16:55.999 "driver_specific": {} 00:16:55.999 } 00:16:55.999 ] 00:16:55.999 09:21:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:55.999 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:55.999 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:56.258 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:56.258 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:56.258 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:56.258 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:56.258 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:56.258 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:56.258 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:56.258 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:56.259 09:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.259 09:21:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:56.259 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:56.259 "name": "Existed_Raid", 00:16:56.259 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:56.259 "strip_size_kb": 0, 00:16:56.259 "state": "online", 00:16:56.259 "raid_level": "raid1", 00:16:56.259 "superblock": true, 00:16:56.259 "num_base_bdevs": 3, 00:16:56.259 "num_base_bdevs_discovered": 3, 00:16:56.259 "num_base_bdevs_operational": 3, 00:16:56.259 "base_bdevs_list": [ 00:16:56.259 { 00:16:56.259 "name": "NewBaseBdev", 00:16:56.259 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:56.259 "is_configured": true, 00:16:56.259 "data_offset": 2048, 00:16:56.259 "data_size": 63488 00:16:56.259 }, 00:16:56.259 { 00:16:56.259 "name": "BaseBdev2", 00:16:56.259 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:56.259 "is_configured": true, 00:16:56.259 "data_offset": 2048, 00:16:56.259 "data_size": 63488 00:16:56.259 }, 00:16:56.259 { 00:16:56.259 "name": "BaseBdev3", 00:16:56.259 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:56.259 "is_configured": true, 00:16:56.259 "data_offset": 2048, 00:16:56.259 "data_size": 63488 00:16:56.259 } 00:16:56.259 ] 00:16:56.259 }' 00:16:56.259 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:56.259 09:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:57.194 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:57.194 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:57.194 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:57.194 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:57.194 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:57.194 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:57.194 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:57.194 09:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:57.194 [2024-07-15 09:21:06.022671] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:57.194 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:57.194 "name": "Existed_Raid", 00:16:57.194 "aliases": [ 00:16:57.194 "2021bed2-5a0c-4fbb-b750-2c8b526bda95" 00:16:57.194 ], 00:16:57.194 "product_name": "Raid Volume", 00:16:57.194 "block_size": 512, 00:16:57.194 "num_blocks": 63488, 00:16:57.194 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:57.194 "assigned_rate_limits": { 00:16:57.194 "rw_ios_per_sec": 0, 00:16:57.194 "rw_mbytes_per_sec": 0, 00:16:57.194 "r_mbytes_per_sec": 0, 00:16:57.194 "w_mbytes_per_sec": 0 00:16:57.194 }, 00:16:57.194 "claimed": false, 00:16:57.194 "zoned": false, 00:16:57.194 "supported_io_types": { 00:16:57.194 "read": true, 00:16:57.194 "write": true, 00:16:57.194 "unmap": false, 00:16:57.194 "flush": false, 00:16:57.194 "reset": true, 00:16:57.194 "nvme_admin": false, 00:16:57.194 "nvme_io": false, 00:16:57.194 "nvme_io_md": 
false, 00:16:57.194 "write_zeroes": true, 00:16:57.194 "zcopy": false, 00:16:57.194 "get_zone_info": false, 00:16:57.194 "zone_management": false, 00:16:57.194 "zone_append": false, 00:16:57.194 "compare": false, 00:16:57.194 "compare_and_write": false, 00:16:57.194 "abort": false, 00:16:57.194 "seek_hole": false, 00:16:57.194 "seek_data": false, 00:16:57.194 "copy": false, 00:16:57.194 "nvme_iov_md": false 00:16:57.194 }, 00:16:57.194 "memory_domains": [ 00:16:57.194 { 00:16:57.194 "dma_device_id": "system", 00:16:57.194 "dma_device_type": 1 00:16:57.194 }, 00:16:57.194 { 00:16:57.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:57.194 "dma_device_type": 2 00:16:57.194 }, 00:16:57.194 { 00:16:57.194 "dma_device_id": "system", 00:16:57.194 "dma_device_type": 1 00:16:57.194 }, 00:16:57.194 { 00:16:57.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:57.194 "dma_device_type": 2 00:16:57.194 }, 00:16:57.194 { 00:16:57.194 "dma_device_id": "system", 00:16:57.194 "dma_device_type": 1 00:16:57.194 }, 00:16:57.194 { 00:16:57.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:57.194 "dma_device_type": 2 00:16:57.194 } 00:16:57.194 ], 00:16:57.194 "driver_specific": { 00:16:57.194 "raid": { 00:16:57.194 "uuid": "2021bed2-5a0c-4fbb-b750-2c8b526bda95", 00:16:57.194 "strip_size_kb": 0, 00:16:57.194 "state": "online", 00:16:57.194 "raid_level": "raid1", 00:16:57.194 "superblock": true, 00:16:57.194 "num_base_bdevs": 3, 00:16:57.194 "num_base_bdevs_discovered": 3, 00:16:57.194 "num_base_bdevs_operational": 3, 00:16:57.194 "base_bdevs_list": [ 00:16:57.194 { 00:16:57.194 "name": "NewBaseBdev", 00:16:57.194 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:57.194 "is_configured": true, 00:16:57.194 "data_offset": 2048, 00:16:57.194 "data_size": 63488 00:16:57.194 }, 00:16:57.194 { 00:16:57.194 "name": "BaseBdev2", 00:16:57.194 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:57.195 "is_configured": true, 00:16:57.195 "data_offset": 2048, 00:16:57.195 "data_size": 63488 00:16:57.195 }, 00:16:57.195 { 00:16:57.195 "name": "BaseBdev3", 00:16:57.195 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:57.195 "is_configured": true, 00:16:57.195 "data_offset": 2048, 00:16:57.195 "data_size": 63488 00:16:57.195 } 00:16:57.195 ] 00:16:57.195 } 00:16:57.195 } 00:16:57.195 }' 00:16:57.195 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:57.195 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:57.195 BaseBdev2 00:16:57.195 BaseBdev3' 00:16:57.195 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:57.195 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:57.195 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:57.454 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:57.454 "name": "NewBaseBdev", 00:16:57.454 "aliases": [ 00:16:57.454 "5a5eda46-d4c9-4298-838b-abd95978f506" 00:16:57.454 ], 00:16:57.454 "product_name": "Malloc disk", 00:16:57.454 "block_size": 512, 00:16:57.454 "num_blocks": 65536, 00:16:57.454 "uuid": "5a5eda46-d4c9-4298-838b-abd95978f506", 00:16:57.454 "assigned_rate_limits": { 00:16:57.454 
"rw_ios_per_sec": 0, 00:16:57.454 "rw_mbytes_per_sec": 0, 00:16:57.454 "r_mbytes_per_sec": 0, 00:16:57.454 "w_mbytes_per_sec": 0 00:16:57.454 }, 00:16:57.454 "claimed": true, 00:16:57.454 "claim_type": "exclusive_write", 00:16:57.454 "zoned": false, 00:16:57.454 "supported_io_types": { 00:16:57.454 "read": true, 00:16:57.454 "write": true, 00:16:57.454 "unmap": true, 00:16:57.454 "flush": true, 00:16:57.454 "reset": true, 00:16:57.454 "nvme_admin": false, 00:16:57.454 "nvme_io": false, 00:16:57.454 "nvme_io_md": false, 00:16:57.454 "write_zeroes": true, 00:16:57.454 "zcopy": true, 00:16:57.454 "get_zone_info": false, 00:16:57.454 "zone_management": false, 00:16:57.454 "zone_append": false, 00:16:57.454 "compare": false, 00:16:57.454 "compare_and_write": false, 00:16:57.454 "abort": true, 00:16:57.454 "seek_hole": false, 00:16:57.454 "seek_data": false, 00:16:57.454 "copy": true, 00:16:57.454 "nvme_iov_md": false 00:16:57.454 }, 00:16:57.454 "memory_domains": [ 00:16:57.454 { 00:16:57.454 "dma_device_id": "system", 00:16:57.454 "dma_device_type": 1 00:16:57.454 }, 00:16:57.454 { 00:16:57.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:57.454 "dma_device_type": 2 00:16:57.454 } 00:16:57.454 ], 00:16:57.454 "driver_specific": {} 00:16:57.454 }' 00:16:57.454 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:57.454 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:57.714 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:57.973 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:57.973 "name": "BaseBdev2", 00:16:57.973 "aliases": [ 00:16:57.973 "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2" 00:16:57.973 ], 00:16:57.973 "product_name": "Malloc disk", 00:16:57.973 "block_size": 512, 00:16:57.973 "num_blocks": 65536, 00:16:57.973 "uuid": "c11e16a5-88e6-4ef1-ba51-ce8c3ffb21e2", 00:16:57.973 "assigned_rate_limits": { 00:16:57.973 "rw_ios_per_sec": 0, 00:16:57.973 "rw_mbytes_per_sec": 0, 00:16:57.973 "r_mbytes_per_sec": 0, 00:16:57.973 "w_mbytes_per_sec": 0 
00:16:57.973 }, 00:16:57.973 "claimed": true, 00:16:57.973 "claim_type": "exclusive_write", 00:16:57.973 "zoned": false, 00:16:57.973 "supported_io_types": { 00:16:57.973 "read": true, 00:16:57.973 "write": true, 00:16:57.973 "unmap": true, 00:16:57.973 "flush": true, 00:16:57.973 "reset": true, 00:16:57.973 "nvme_admin": false, 00:16:57.973 "nvme_io": false, 00:16:57.973 "nvme_io_md": false, 00:16:57.973 "write_zeroes": true, 00:16:57.973 "zcopy": true, 00:16:57.973 "get_zone_info": false, 00:16:57.973 "zone_management": false, 00:16:57.973 "zone_append": false, 00:16:57.973 "compare": false, 00:16:57.973 "compare_and_write": false, 00:16:57.973 "abort": true, 00:16:57.973 "seek_hole": false, 00:16:57.973 "seek_data": false, 00:16:57.973 "copy": true, 00:16:57.973 "nvme_iov_md": false 00:16:57.973 }, 00:16:57.973 "memory_domains": [ 00:16:57.973 { 00:16:57.973 "dma_device_id": "system", 00:16:57.973 "dma_device_type": 1 00:16:57.973 }, 00:16:57.973 { 00:16:57.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:57.973 "dma_device_type": 2 00:16:57.973 } 00:16:57.973 ], 00:16:57.973 "driver_specific": {} 00:16:57.973 }' 00:16:57.973 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.233 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.233 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:58.233 09:21:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.233 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.233 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:58.233 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:58.233 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:58.233 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:58.233 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:58.492 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:58.492 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:58.492 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:58.492 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:58.492 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:58.751 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:58.751 "name": "BaseBdev3", 00:16:58.751 "aliases": [ 00:16:58.751 "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4" 00:16:58.751 ], 00:16:58.751 "product_name": "Malloc disk", 00:16:58.751 "block_size": 512, 00:16:58.751 "num_blocks": 65536, 00:16:58.751 "uuid": "ba7e9f45-973c-47ee-bcaa-9b56b7d122b4", 00:16:58.751 "assigned_rate_limits": { 00:16:58.751 "rw_ios_per_sec": 0, 00:16:58.751 "rw_mbytes_per_sec": 0, 00:16:58.751 "r_mbytes_per_sec": 0, 00:16:58.751 "w_mbytes_per_sec": 0 00:16:58.751 }, 00:16:58.751 "claimed": true, 00:16:58.751 "claim_type": "exclusive_write", 00:16:58.751 "zoned": false, 00:16:58.751 
"supported_io_types": { 00:16:58.751 "read": true, 00:16:58.751 "write": true, 00:16:58.751 "unmap": true, 00:16:58.751 "flush": true, 00:16:58.751 "reset": true, 00:16:58.751 "nvme_admin": false, 00:16:58.751 "nvme_io": false, 00:16:58.751 "nvme_io_md": false, 00:16:58.751 "write_zeroes": true, 00:16:58.751 "zcopy": true, 00:16:58.751 "get_zone_info": false, 00:16:58.751 "zone_management": false, 00:16:58.751 "zone_append": false, 00:16:58.751 "compare": false, 00:16:58.751 "compare_and_write": false, 00:16:58.751 "abort": true, 00:16:58.751 "seek_hole": false, 00:16:58.751 "seek_data": false, 00:16:58.751 "copy": true, 00:16:58.751 "nvme_iov_md": false 00:16:58.751 }, 00:16:58.751 "memory_domains": [ 00:16:58.751 { 00:16:58.751 "dma_device_id": "system", 00:16:58.751 "dma_device_type": 1 00:16:58.751 }, 00:16:58.751 { 00:16:58.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.751 "dma_device_type": 2 00:16:58.751 } 00:16:58.751 ], 00:16:58.751 "driver_specific": {} 00:16:58.751 }' 00:16:58.751 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.751 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.751 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:58.751 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.751 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.751 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:58.751 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.009 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.009 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.009 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.009 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.009 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:59.009 09:21:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:59.266 [2024-07-15 09:21:08.051782] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:59.266 [2024-07-15 09:21:08.051812] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:59.266 [2024-07-15 09:21:08.051866] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:59.266 [2024-07-15 09:21:08.052147] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:59.266 [2024-07-15 09:21:08.052161] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fdc1b0 name Existed_Raid, state offline 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 135084 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 135084 ']' 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 135084 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@953 -- # uname 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 135084 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 135084' 00:16:59.266 killing process with pid 135084 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 135084 00:16:59.266 [2024-07-15 09:21:08.121264] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:59.266 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 135084 00:16:59.266 [2024-07-15 09:21:08.147383] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:59.524 09:21:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:59.524 00:16:59.524 real 0m28.984s 00:16:59.524 user 0m53.169s 00:16:59.524 sys 0m5.189s 00:16:59.524 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:59.524 09:21:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:59.524 ************************************ 00:16:59.524 END TEST raid_state_function_test_sb 00:16:59.524 ************************************ 00:16:59.524 09:21:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:59.524 09:21:08 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:16:59.524 09:21:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:59.524 09:21:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:59.524 09:21:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:59.524 ************************************ 00:16:59.524 START TEST raid_superblock_test 00:16:59.524 ************************************ 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=140041 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 140041 /var/tmp/spdk-raid.sock 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 140041 ']' 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:59.524 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:59.524 09:21:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.783 [2024-07-15 09:21:08.496230] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:16:59.783 [2024-07-15 09:21:08.496294] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140041 ] 00:16:59.783 [2024-07-15 09:21:08.625645] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:59.783 [2024-07-15 09:21:08.729865] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:00.041 [2024-07-15 09:21:08.797901] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:00.042 [2024-07-15 09:21:08.797967] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:00.607 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:00.865 malloc1 00:17:00.865 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:01.123 [2024-07-15 09:21:09.905431] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:01.123 [2024-07-15 09:21:09.905478] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:01.123 [2024-07-15 09:21:09.905498] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a2570 00:17:01.123 [2024-07-15 09:21:09.905511] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:01.123 [2024-07-15 09:21:09.907142] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:01.123 [2024-07-15 09:21:09.907169] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:01.123 pt1 00:17:01.123 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:01.123 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:01.123 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:01.123 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:01.123 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:01.123 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:01.123 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:01.123 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:01.123 09:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:01.381 malloc2 00:17:01.381 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:01.639 [2024-07-15 09:21:10.423727] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:01.639 [2024-07-15 09:21:10.423773] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:01.639 [2024-07-15 09:21:10.423790] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a3970 00:17:01.639 [2024-07-15 09:21:10.423803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:01.639 [2024-07-15 09:21:10.425474] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:01.639 [2024-07-15 09:21:10.425502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:01.639 pt2 00:17:01.639 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:01.639 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:01.639 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:01.639 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:01.639 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:01.639 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:01.639 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:01.639 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:01.639 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:01.897 malloc3 00:17:01.897 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:02.155 [2024-07-15 09:21:10.910893] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:02.155 [2024-07-15 09:21:10.910947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:02.155 [2024-07-15 09:21:10.910966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x233a340 00:17:02.155 [2024-07-15 09:21:10.910980] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:02.155 [2024-07-15 09:21:10.912560] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:02.155 [2024-07-15 09:21:10.912591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:02.155 pt3 00:17:02.155 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:02.155 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:02.155 09:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:02.412 [2024-07-15 09:21:11.147535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:02.413 [2024-07-15 09:21:11.148873] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:02.413 [2024-07-15 09:21:11.148934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:02.413 [2024-07-15 09:21:11.149089] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x219aea0 00:17:02.413 [2024-07-15 09:21:11.149100] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:02.413 [2024-07-15 09:21:11.149299] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a2240 00:17:02.413 [2024-07-15 09:21:11.149450] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x219aea0 00:17:02.413 [2024-07-15 09:21:11.149460] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x219aea0 00:17:02.413 [2024-07-15 09:21:11.149557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.413 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:02.670 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.670 "name": "raid_bdev1", 00:17:02.670 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:02.670 "strip_size_kb": 0, 00:17:02.670 "state": "online", 00:17:02.670 "raid_level": "raid1", 00:17:02.670 "superblock": true, 00:17:02.670 "num_base_bdevs": 3, 00:17:02.670 
"num_base_bdevs_discovered": 3, 00:17:02.670 "num_base_bdevs_operational": 3, 00:17:02.670 "base_bdevs_list": [ 00:17:02.670 { 00:17:02.670 "name": "pt1", 00:17:02.670 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:02.670 "is_configured": true, 00:17:02.670 "data_offset": 2048, 00:17:02.670 "data_size": 63488 00:17:02.670 }, 00:17:02.670 { 00:17:02.670 "name": "pt2", 00:17:02.670 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:02.670 "is_configured": true, 00:17:02.670 "data_offset": 2048, 00:17:02.670 "data_size": 63488 00:17:02.670 }, 00:17:02.670 { 00:17:02.671 "name": "pt3", 00:17:02.671 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:02.671 "is_configured": true, 00:17:02.671 "data_offset": 2048, 00:17:02.671 "data_size": 63488 00:17:02.671 } 00:17:02.671 ] 00:17:02.671 }' 00:17:02.671 09:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.671 09:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:03.245 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:03.245 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:03.245 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:03.245 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:03.245 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:03.245 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:03.245 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:03.245 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:03.512 [2024-07-15 09:21:12.238702] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:03.512 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:03.512 "name": "raid_bdev1", 00:17:03.512 "aliases": [ 00:17:03.512 "7af8e277-5456-4c7b-babd-78d21b7cdd29" 00:17:03.512 ], 00:17:03.512 "product_name": "Raid Volume", 00:17:03.512 "block_size": 512, 00:17:03.512 "num_blocks": 63488, 00:17:03.512 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:03.512 "assigned_rate_limits": { 00:17:03.512 "rw_ios_per_sec": 0, 00:17:03.512 "rw_mbytes_per_sec": 0, 00:17:03.512 "r_mbytes_per_sec": 0, 00:17:03.512 "w_mbytes_per_sec": 0 00:17:03.512 }, 00:17:03.512 "claimed": false, 00:17:03.512 "zoned": false, 00:17:03.512 "supported_io_types": { 00:17:03.512 "read": true, 00:17:03.512 "write": true, 00:17:03.512 "unmap": false, 00:17:03.512 "flush": false, 00:17:03.512 "reset": true, 00:17:03.512 "nvme_admin": false, 00:17:03.512 "nvme_io": false, 00:17:03.512 "nvme_io_md": false, 00:17:03.512 "write_zeroes": true, 00:17:03.512 "zcopy": false, 00:17:03.512 "get_zone_info": false, 00:17:03.512 "zone_management": false, 00:17:03.512 "zone_append": false, 00:17:03.512 "compare": false, 00:17:03.512 "compare_and_write": false, 00:17:03.512 "abort": false, 00:17:03.512 "seek_hole": false, 00:17:03.512 "seek_data": false, 00:17:03.512 "copy": false, 00:17:03.512 "nvme_iov_md": false 00:17:03.512 }, 00:17:03.512 "memory_domains": [ 00:17:03.512 { 00:17:03.512 "dma_device_id": "system", 00:17:03.512 "dma_device_type": 1 00:17:03.512 }, 
00:17:03.512 { 00:17:03.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.512 "dma_device_type": 2 00:17:03.513 }, 00:17:03.513 { 00:17:03.513 "dma_device_id": "system", 00:17:03.513 "dma_device_type": 1 00:17:03.513 }, 00:17:03.513 { 00:17:03.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.513 "dma_device_type": 2 00:17:03.513 }, 00:17:03.513 { 00:17:03.513 "dma_device_id": "system", 00:17:03.513 "dma_device_type": 1 00:17:03.513 }, 00:17:03.513 { 00:17:03.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.513 "dma_device_type": 2 00:17:03.513 } 00:17:03.513 ], 00:17:03.513 "driver_specific": { 00:17:03.513 "raid": { 00:17:03.513 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:03.513 "strip_size_kb": 0, 00:17:03.513 "state": "online", 00:17:03.513 "raid_level": "raid1", 00:17:03.513 "superblock": true, 00:17:03.513 "num_base_bdevs": 3, 00:17:03.513 "num_base_bdevs_discovered": 3, 00:17:03.513 "num_base_bdevs_operational": 3, 00:17:03.513 "base_bdevs_list": [ 00:17:03.513 { 00:17:03.513 "name": "pt1", 00:17:03.513 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:03.513 "is_configured": true, 00:17:03.513 "data_offset": 2048, 00:17:03.513 "data_size": 63488 00:17:03.513 }, 00:17:03.513 { 00:17:03.513 "name": "pt2", 00:17:03.513 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:03.513 "is_configured": true, 00:17:03.513 "data_offset": 2048, 00:17:03.513 "data_size": 63488 00:17:03.513 }, 00:17:03.513 { 00:17:03.513 "name": "pt3", 00:17:03.513 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:03.513 "is_configured": true, 00:17:03.513 "data_offset": 2048, 00:17:03.513 "data_size": 63488 00:17:03.513 } 00:17:03.513 ] 00:17:03.513 } 00:17:03.513 } 00:17:03.513 }' 00:17:03.513 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:03.513 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:03.513 pt2 00:17:03.513 pt3' 00:17:03.513 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.513 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:03.513 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.771 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.771 "name": "pt1", 00:17:03.771 "aliases": [ 00:17:03.771 "00000000-0000-0000-0000-000000000001" 00:17:03.771 ], 00:17:03.771 "product_name": "passthru", 00:17:03.771 "block_size": 512, 00:17:03.771 "num_blocks": 65536, 00:17:03.771 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:03.771 "assigned_rate_limits": { 00:17:03.771 "rw_ios_per_sec": 0, 00:17:03.771 "rw_mbytes_per_sec": 0, 00:17:03.771 "r_mbytes_per_sec": 0, 00:17:03.771 "w_mbytes_per_sec": 0 00:17:03.771 }, 00:17:03.771 "claimed": true, 00:17:03.771 "claim_type": "exclusive_write", 00:17:03.771 "zoned": false, 00:17:03.771 "supported_io_types": { 00:17:03.771 "read": true, 00:17:03.771 "write": true, 00:17:03.771 "unmap": true, 00:17:03.771 "flush": true, 00:17:03.771 "reset": true, 00:17:03.771 "nvme_admin": false, 00:17:03.771 "nvme_io": false, 00:17:03.771 "nvme_io_md": false, 00:17:03.771 "write_zeroes": true, 00:17:03.771 "zcopy": true, 00:17:03.771 "get_zone_info": false, 00:17:03.771 "zone_management": false, 00:17:03.771 
"zone_append": false, 00:17:03.771 "compare": false, 00:17:03.771 "compare_and_write": false, 00:17:03.771 "abort": true, 00:17:03.771 "seek_hole": false, 00:17:03.771 "seek_data": false, 00:17:03.771 "copy": true, 00:17:03.771 "nvme_iov_md": false 00:17:03.771 }, 00:17:03.771 "memory_domains": [ 00:17:03.771 { 00:17:03.771 "dma_device_id": "system", 00:17:03.771 "dma_device_type": 1 00:17:03.771 }, 00:17:03.771 { 00:17:03.771 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.771 "dma_device_type": 2 00:17:03.771 } 00:17:03.771 ], 00:17:03.771 "driver_specific": { 00:17:03.771 "passthru": { 00:17:03.771 "name": "pt1", 00:17:03.771 "base_bdev_name": "malloc1" 00:17:03.771 } 00:17:03.771 } 00:17:03.771 }' 00:17:03.771 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.771 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.771 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.771 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.771 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:04.029 09:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:04.287 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:04.287 "name": "pt2", 00:17:04.287 "aliases": [ 00:17:04.287 "00000000-0000-0000-0000-000000000002" 00:17:04.287 ], 00:17:04.287 "product_name": "passthru", 00:17:04.287 "block_size": 512, 00:17:04.287 "num_blocks": 65536, 00:17:04.287 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:04.287 "assigned_rate_limits": { 00:17:04.287 "rw_ios_per_sec": 0, 00:17:04.287 "rw_mbytes_per_sec": 0, 00:17:04.287 "r_mbytes_per_sec": 0, 00:17:04.287 "w_mbytes_per_sec": 0 00:17:04.287 }, 00:17:04.287 "claimed": true, 00:17:04.287 "claim_type": "exclusive_write", 00:17:04.287 "zoned": false, 00:17:04.287 "supported_io_types": { 00:17:04.287 "read": true, 00:17:04.287 "write": true, 00:17:04.287 "unmap": true, 00:17:04.287 "flush": true, 00:17:04.287 "reset": true, 00:17:04.287 "nvme_admin": false, 00:17:04.287 "nvme_io": false, 00:17:04.287 "nvme_io_md": false, 00:17:04.287 "write_zeroes": true, 00:17:04.287 "zcopy": true, 00:17:04.287 "get_zone_info": false, 00:17:04.287 "zone_management": false, 00:17:04.287 "zone_append": false, 00:17:04.287 "compare": false, 00:17:04.287 "compare_and_write": false, 00:17:04.287 "abort": true, 00:17:04.287 
"seek_hole": false, 00:17:04.287 "seek_data": false, 00:17:04.287 "copy": true, 00:17:04.287 "nvme_iov_md": false 00:17:04.287 }, 00:17:04.287 "memory_domains": [ 00:17:04.287 { 00:17:04.287 "dma_device_id": "system", 00:17:04.287 "dma_device_type": 1 00:17:04.287 }, 00:17:04.287 { 00:17:04.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.287 "dma_device_type": 2 00:17:04.287 } 00:17:04.287 ], 00:17:04.287 "driver_specific": { 00:17:04.287 "passthru": { 00:17:04.287 "name": "pt2", 00:17:04.287 "base_bdev_name": "malloc2" 00:17:04.287 } 00:17:04.287 } 00:17:04.287 }' 00:17:04.287 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.287 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.548 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:04.548 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.548 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.548 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:04.548 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.548 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.548 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:04.548 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.548 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.806 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:04.806 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:04.806 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:04.806 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:05.064 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:05.064 "name": "pt3", 00:17:05.064 "aliases": [ 00:17:05.064 "00000000-0000-0000-0000-000000000003" 00:17:05.064 ], 00:17:05.064 "product_name": "passthru", 00:17:05.064 "block_size": 512, 00:17:05.064 "num_blocks": 65536, 00:17:05.064 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:05.064 "assigned_rate_limits": { 00:17:05.064 "rw_ios_per_sec": 0, 00:17:05.064 "rw_mbytes_per_sec": 0, 00:17:05.064 "r_mbytes_per_sec": 0, 00:17:05.064 "w_mbytes_per_sec": 0 00:17:05.064 }, 00:17:05.064 "claimed": true, 00:17:05.064 "claim_type": "exclusive_write", 00:17:05.064 "zoned": false, 00:17:05.064 "supported_io_types": { 00:17:05.064 "read": true, 00:17:05.064 "write": true, 00:17:05.064 "unmap": true, 00:17:05.064 "flush": true, 00:17:05.064 "reset": true, 00:17:05.064 "nvme_admin": false, 00:17:05.064 "nvme_io": false, 00:17:05.064 "nvme_io_md": false, 00:17:05.064 "write_zeroes": true, 00:17:05.064 "zcopy": true, 00:17:05.064 "get_zone_info": false, 00:17:05.064 "zone_management": false, 00:17:05.064 "zone_append": false, 00:17:05.064 "compare": false, 00:17:05.064 "compare_and_write": false, 00:17:05.064 "abort": true, 00:17:05.064 "seek_hole": false, 00:17:05.064 "seek_data": false, 00:17:05.064 "copy": true, 00:17:05.064 "nvme_iov_md": false 00:17:05.064 }, 
00:17:05.064 "memory_domains": [ 00:17:05.064 { 00:17:05.064 "dma_device_id": "system", 00:17:05.064 "dma_device_type": 1 00:17:05.064 }, 00:17:05.064 { 00:17:05.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.064 "dma_device_type": 2 00:17:05.064 } 00:17:05.064 ], 00:17:05.064 "driver_specific": { 00:17:05.064 "passthru": { 00:17:05.064 "name": "pt3", 00:17:05.064 "base_bdev_name": "malloc3" 00:17:05.064 } 00:17:05.064 } 00:17:05.064 }' 00:17:05.064 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.064 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.064 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:05.065 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.065 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.065 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:05.065 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.065 09:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.323 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:05.323 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.323 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.324 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:05.324 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:05.324 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:05.582 [2024-07-15 09:21:14.336250] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:05.582 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7af8e277-5456-4c7b-babd-78d21b7cdd29 00:17:05.582 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7af8e277-5456-4c7b-babd-78d21b7cdd29 ']' 00:17:05.582 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:05.841 [2024-07-15 09:21:14.580640] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:05.841 [2024-07-15 09:21:14.580664] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:05.841 [2024-07-15 09:21:14.580711] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:05.841 [2024-07-15 09:21:14.580780] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:05.841 [2024-07-15 09:21:14.580793] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x219aea0 name raid_bdev1, state offline 00:17:05.841 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.841 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:06.100 09:21:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:06.100 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:06.100 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:06.100 09:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:06.359 09:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:06.359 09:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:06.618 09:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:06.618 09:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:06.618 09:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:06.618 09:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:06.876 09:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:07.135 
[2024-07-15 09:21:16.024398] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:07.135 [2024-07-15 09:21:16.025787] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:07.135 [2024-07-15 09:21:16.025829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:07.135 [2024-07-15 09:21:16.025874] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:07.135 [2024-07-15 09:21:16.025913] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:07.135 [2024-07-15 09:21:16.025946] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:07.135 [2024-07-15 09:21:16.025965] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:07.135 [2024-07-15 09:21:16.025975] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2345ff0 name raid_bdev1, state configuring 00:17:07.135 request: 00:17:07.135 { 00:17:07.135 "name": "raid_bdev1", 00:17:07.135 "raid_level": "raid1", 00:17:07.135 "base_bdevs": [ 00:17:07.135 "malloc1", 00:17:07.135 "malloc2", 00:17:07.135 "malloc3" 00:17:07.135 ], 00:17:07.135 "superblock": false, 00:17:07.135 "method": "bdev_raid_create", 00:17:07.135 "req_id": 1 00:17:07.135 } 00:17:07.135 Got JSON-RPC error response 00:17:07.135 response: 00:17:07.135 { 00:17:07.135 "code": -17, 00:17:07.135 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:07.135 } 00:17:07.135 09:21:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:07.135 09:21:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:07.135 09:21:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:07.135 09:21:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:07.135 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.135 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:07.394 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:07.394 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:07.394 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:07.653 [2024-07-15 09:21:16.513628] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:07.653 [2024-07-15 09:21:16.513673] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:07.653 [2024-07-15 09:21:16.513694] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a27a0 00:17:07.653 [2024-07-15 09:21:16.513706] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:07.653 [2024-07-15 09:21:16.515337] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:07.653 [2024-07-15 09:21:16.515363] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:07.653 
[2024-07-15 09:21:16.515426] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:07.653 [2024-07-15 09:21:16.515454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:07.653 pt1 00:17:07.653 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:07.653 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:07.653 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:07.653 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:07.653 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:07.653 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:07.653 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.653 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.653 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.654 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.654 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.654 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:07.913 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:07.913 "name": "raid_bdev1", 00:17:07.913 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:07.913 "strip_size_kb": 0, 00:17:07.913 "state": "configuring", 00:17:07.913 "raid_level": "raid1", 00:17:07.913 "superblock": true, 00:17:07.913 "num_base_bdevs": 3, 00:17:07.913 "num_base_bdevs_discovered": 1, 00:17:07.913 "num_base_bdevs_operational": 3, 00:17:07.913 "base_bdevs_list": [ 00:17:07.913 { 00:17:07.913 "name": "pt1", 00:17:07.913 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:07.913 "is_configured": true, 00:17:07.913 "data_offset": 2048, 00:17:07.913 "data_size": 63488 00:17:07.913 }, 00:17:07.913 { 00:17:07.913 "name": null, 00:17:07.913 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:07.913 "is_configured": false, 00:17:07.913 "data_offset": 2048, 00:17:07.913 "data_size": 63488 00:17:07.913 }, 00:17:07.913 { 00:17:07.913 "name": null, 00:17:07.913 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:07.913 "is_configured": false, 00:17:07.913 "data_offset": 2048, 00:17:07.913 "data_size": 63488 00:17:07.913 } 00:17:07.913 ] 00:17:07.913 }' 00:17:07.913 09:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:07.913 09:21:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:08.481 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:08.481 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:08.741 [2024-07-15 09:21:17.524319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:08.741 
[2024-07-15 09:21:17.524363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:08.741 [2024-07-15 09:21:17.524380] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2199a10 00:17:08.741 [2024-07-15 09:21:17.524393] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:08.741 [2024-07-15 09:21:17.524723] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:08.741 [2024-07-15 09:21:17.524740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:08.741 [2024-07-15 09:21:17.524796] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:08.741 [2024-07-15 09:21:17.524815] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:08.741 pt2 00:17:08.741 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:09.001 [2024-07-15 09:21:17.768988] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.001 09:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:09.570 09:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.570 "name": "raid_bdev1", 00:17:09.570 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:09.570 "strip_size_kb": 0, 00:17:09.570 "state": "configuring", 00:17:09.570 "raid_level": "raid1", 00:17:09.570 "superblock": true, 00:17:09.570 "num_base_bdevs": 3, 00:17:09.570 "num_base_bdevs_discovered": 1, 00:17:09.570 "num_base_bdevs_operational": 3, 00:17:09.570 "base_bdevs_list": [ 00:17:09.570 { 00:17:09.570 "name": "pt1", 00:17:09.570 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:09.570 "is_configured": true, 00:17:09.570 "data_offset": 2048, 00:17:09.570 "data_size": 63488 00:17:09.570 }, 00:17:09.570 { 00:17:09.570 "name": null, 00:17:09.570 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:09.570 "is_configured": false, 00:17:09.570 "data_offset": 2048, 00:17:09.570 "data_size": 63488 00:17:09.570 }, 00:17:09.570 { 00:17:09.570 
"name": null, 00:17:09.570 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:09.570 "is_configured": false, 00:17:09.570 "data_offset": 2048, 00:17:09.570 "data_size": 63488 00:17:09.570 } 00:17:09.570 ] 00:17:09.570 }' 00:17:09.570 09:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.570 09:21:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.136 09:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:10.136 09:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:10.136 09:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:10.136 [2024-07-15 09:21:19.052373] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:10.137 [2024-07-15 09:21:19.052416] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:10.137 [2024-07-15 09:21:19.052436] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a2a10 00:17:10.137 [2024-07-15 09:21:19.052449] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:10.137 [2024-07-15 09:21:19.052778] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:10.137 [2024-07-15 09:21:19.052795] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:10.137 [2024-07-15 09:21:19.052855] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:10.137 [2024-07-15 09:21:19.052874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:10.137 pt2 00:17:10.137 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:10.137 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:10.137 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:10.395 [2024-07-15 09:21:19.232849] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:10.395 [2024-07-15 09:21:19.232882] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:10.395 [2024-07-15 09:21:19.232898] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21996c0 00:17:10.395 [2024-07-15 09:21:19.232910] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:10.395 [2024-07-15 09:21:19.233204] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:10.395 [2024-07-15 09:21:19.233220] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:10.395 [2024-07-15 09:21:19.233270] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:10.395 [2024-07-15 09:21:19.233288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:10.395 [2024-07-15 09:21:19.233391] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x233cc00 00:17:10.395 [2024-07-15 09:21:19.233401] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:10.395 [2024-07-15 
09:21:19.233563] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x219c610 00:17:10.396 [2024-07-15 09:21:19.233689] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x233cc00 00:17:10.396 [2024-07-15 09:21:19.233699] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x233cc00 00:17:10.396 [2024-07-15 09:21:19.233794] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:10.396 pt3 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.396 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:10.654 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.654 "name": "raid_bdev1", 00:17:10.654 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:10.654 "strip_size_kb": 0, 00:17:10.654 "state": "online", 00:17:10.654 "raid_level": "raid1", 00:17:10.654 "superblock": true, 00:17:10.654 "num_base_bdevs": 3, 00:17:10.654 "num_base_bdevs_discovered": 3, 00:17:10.654 "num_base_bdevs_operational": 3, 00:17:10.654 "base_bdevs_list": [ 00:17:10.654 { 00:17:10.654 "name": "pt1", 00:17:10.654 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:10.654 "is_configured": true, 00:17:10.654 "data_offset": 2048, 00:17:10.654 "data_size": 63488 00:17:10.654 }, 00:17:10.654 { 00:17:10.654 "name": "pt2", 00:17:10.654 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:10.654 "is_configured": true, 00:17:10.654 "data_offset": 2048, 00:17:10.654 "data_size": 63488 00:17:10.654 }, 00:17:10.654 { 00:17:10.654 "name": "pt3", 00:17:10.654 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:10.654 "is_configured": true, 00:17:10.654 "data_offset": 2048, 00:17:10.655 "data_size": 63488 00:17:10.655 } 00:17:10.655 ] 00:17:10.655 }' 00:17:10.655 09:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.655 09:21:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:11.222 09:21:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:11.222 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:11.222 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:11.222 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:11.222 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:11.222 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:11.222 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:11.222 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:11.482 [2024-07-15 09:21:20.275902] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:11.482 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:11.482 "name": "raid_bdev1", 00:17:11.482 "aliases": [ 00:17:11.482 "7af8e277-5456-4c7b-babd-78d21b7cdd29" 00:17:11.482 ], 00:17:11.482 "product_name": "Raid Volume", 00:17:11.482 "block_size": 512, 00:17:11.482 "num_blocks": 63488, 00:17:11.482 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:11.482 "assigned_rate_limits": { 00:17:11.482 "rw_ios_per_sec": 0, 00:17:11.482 "rw_mbytes_per_sec": 0, 00:17:11.482 "r_mbytes_per_sec": 0, 00:17:11.482 "w_mbytes_per_sec": 0 00:17:11.482 }, 00:17:11.482 "claimed": false, 00:17:11.482 "zoned": false, 00:17:11.482 "supported_io_types": { 00:17:11.482 "read": true, 00:17:11.482 "write": true, 00:17:11.482 "unmap": false, 00:17:11.482 "flush": false, 00:17:11.482 "reset": true, 00:17:11.482 "nvme_admin": false, 00:17:11.482 "nvme_io": false, 00:17:11.482 "nvme_io_md": false, 00:17:11.482 "write_zeroes": true, 00:17:11.482 "zcopy": false, 00:17:11.482 "get_zone_info": false, 00:17:11.482 "zone_management": false, 00:17:11.482 "zone_append": false, 00:17:11.482 "compare": false, 00:17:11.482 "compare_and_write": false, 00:17:11.482 "abort": false, 00:17:11.482 "seek_hole": false, 00:17:11.482 "seek_data": false, 00:17:11.482 "copy": false, 00:17:11.482 "nvme_iov_md": false 00:17:11.482 }, 00:17:11.482 "memory_domains": [ 00:17:11.482 { 00:17:11.482 "dma_device_id": "system", 00:17:11.482 "dma_device_type": 1 00:17:11.482 }, 00:17:11.482 { 00:17:11.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.482 "dma_device_type": 2 00:17:11.482 }, 00:17:11.482 { 00:17:11.482 "dma_device_id": "system", 00:17:11.482 "dma_device_type": 1 00:17:11.482 }, 00:17:11.482 { 00:17:11.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.482 "dma_device_type": 2 00:17:11.482 }, 00:17:11.482 { 00:17:11.482 "dma_device_id": "system", 00:17:11.482 "dma_device_type": 1 00:17:11.482 }, 00:17:11.482 { 00:17:11.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.482 "dma_device_type": 2 00:17:11.482 } 00:17:11.482 ], 00:17:11.482 "driver_specific": { 00:17:11.482 "raid": { 00:17:11.482 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:11.482 "strip_size_kb": 0, 00:17:11.482 "state": "online", 00:17:11.482 "raid_level": "raid1", 00:17:11.482 "superblock": true, 00:17:11.482 "num_base_bdevs": 3, 00:17:11.482 "num_base_bdevs_discovered": 3, 00:17:11.482 "num_base_bdevs_operational": 3, 00:17:11.482 "base_bdevs_list": [ 00:17:11.482 { 00:17:11.482 
"name": "pt1", 00:17:11.482 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:11.482 "is_configured": true, 00:17:11.482 "data_offset": 2048, 00:17:11.482 "data_size": 63488 00:17:11.482 }, 00:17:11.482 { 00:17:11.482 "name": "pt2", 00:17:11.482 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:11.482 "is_configured": true, 00:17:11.482 "data_offset": 2048, 00:17:11.482 "data_size": 63488 00:17:11.482 }, 00:17:11.482 { 00:17:11.482 "name": "pt3", 00:17:11.482 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:11.482 "is_configured": true, 00:17:11.482 "data_offset": 2048, 00:17:11.482 "data_size": 63488 00:17:11.482 } 00:17:11.482 ] 00:17:11.482 } 00:17:11.482 } 00:17:11.482 }' 00:17:11.482 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:11.482 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:11.482 pt2 00:17:11.482 pt3' 00:17:11.482 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:11.482 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:11.482 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:11.742 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:11.742 "name": "pt1", 00:17:11.742 "aliases": [ 00:17:11.742 "00000000-0000-0000-0000-000000000001" 00:17:11.742 ], 00:17:11.742 "product_name": "passthru", 00:17:11.742 "block_size": 512, 00:17:11.742 "num_blocks": 65536, 00:17:11.742 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:11.742 "assigned_rate_limits": { 00:17:11.742 "rw_ios_per_sec": 0, 00:17:11.742 "rw_mbytes_per_sec": 0, 00:17:11.742 "r_mbytes_per_sec": 0, 00:17:11.742 "w_mbytes_per_sec": 0 00:17:11.742 }, 00:17:11.742 "claimed": true, 00:17:11.742 "claim_type": "exclusive_write", 00:17:11.742 "zoned": false, 00:17:11.742 "supported_io_types": { 00:17:11.742 "read": true, 00:17:11.742 "write": true, 00:17:11.742 "unmap": true, 00:17:11.742 "flush": true, 00:17:11.742 "reset": true, 00:17:11.742 "nvme_admin": false, 00:17:11.742 "nvme_io": false, 00:17:11.742 "nvme_io_md": false, 00:17:11.742 "write_zeroes": true, 00:17:11.742 "zcopy": true, 00:17:11.742 "get_zone_info": false, 00:17:11.742 "zone_management": false, 00:17:11.742 "zone_append": false, 00:17:11.742 "compare": false, 00:17:11.742 "compare_and_write": false, 00:17:11.742 "abort": true, 00:17:11.742 "seek_hole": false, 00:17:11.742 "seek_data": false, 00:17:11.742 "copy": true, 00:17:11.742 "nvme_iov_md": false 00:17:11.742 }, 00:17:11.742 "memory_domains": [ 00:17:11.742 { 00:17:11.742 "dma_device_id": "system", 00:17:11.742 "dma_device_type": 1 00:17:11.742 }, 00:17:11.742 { 00:17:11.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.742 "dma_device_type": 2 00:17:11.742 } 00:17:11.742 ], 00:17:11.742 "driver_specific": { 00:17:11.742 "passthru": { 00:17:11.742 "name": "pt1", 00:17:11.742 "base_bdev_name": "malloc1" 00:17:11.742 } 00:17:11.742 } 00:17:11.742 }' 00:17:11.742 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:11.742 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:11.742 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:11.742 
09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:11.742 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:12.001 09:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:12.261 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:12.261 "name": "pt2", 00:17:12.261 "aliases": [ 00:17:12.261 "00000000-0000-0000-0000-000000000002" 00:17:12.261 ], 00:17:12.261 "product_name": "passthru", 00:17:12.261 "block_size": 512, 00:17:12.261 "num_blocks": 65536, 00:17:12.261 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:12.261 "assigned_rate_limits": { 00:17:12.261 "rw_ios_per_sec": 0, 00:17:12.261 "rw_mbytes_per_sec": 0, 00:17:12.261 "r_mbytes_per_sec": 0, 00:17:12.261 "w_mbytes_per_sec": 0 00:17:12.261 }, 00:17:12.261 "claimed": true, 00:17:12.261 "claim_type": "exclusive_write", 00:17:12.261 "zoned": false, 00:17:12.261 "supported_io_types": { 00:17:12.261 "read": true, 00:17:12.261 "write": true, 00:17:12.261 "unmap": true, 00:17:12.261 "flush": true, 00:17:12.261 "reset": true, 00:17:12.261 "nvme_admin": false, 00:17:12.261 "nvme_io": false, 00:17:12.261 "nvme_io_md": false, 00:17:12.261 "write_zeroes": true, 00:17:12.261 "zcopy": true, 00:17:12.261 "get_zone_info": false, 00:17:12.261 "zone_management": false, 00:17:12.261 "zone_append": false, 00:17:12.261 "compare": false, 00:17:12.261 "compare_and_write": false, 00:17:12.261 "abort": true, 00:17:12.261 "seek_hole": false, 00:17:12.261 "seek_data": false, 00:17:12.261 "copy": true, 00:17:12.261 "nvme_iov_md": false 00:17:12.261 }, 00:17:12.261 "memory_domains": [ 00:17:12.261 { 00:17:12.261 "dma_device_id": "system", 00:17:12.261 "dma_device_type": 1 00:17:12.261 }, 00:17:12.261 { 00:17:12.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.261 "dma_device_type": 2 00:17:12.261 } 00:17:12.261 ], 00:17:12.261 "driver_specific": { 00:17:12.261 "passthru": { 00:17:12.261 "name": "pt2", 00:17:12.261 "base_bdev_name": "malloc2" 00:17:12.261 } 00:17:12.261 } 00:17:12.261 }' 00:17:12.261 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.261 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.261 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:12.261 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.261 09:21:21 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:12.520 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:12.780 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:12.780 "name": "pt3", 00:17:12.780 "aliases": [ 00:17:12.780 "00000000-0000-0000-0000-000000000003" 00:17:12.780 ], 00:17:12.780 "product_name": "passthru", 00:17:12.780 "block_size": 512, 00:17:12.780 "num_blocks": 65536, 00:17:12.780 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:12.780 "assigned_rate_limits": { 00:17:12.780 "rw_ios_per_sec": 0, 00:17:12.780 "rw_mbytes_per_sec": 0, 00:17:12.780 "r_mbytes_per_sec": 0, 00:17:12.780 "w_mbytes_per_sec": 0 00:17:12.780 }, 00:17:12.780 "claimed": true, 00:17:12.780 "claim_type": "exclusive_write", 00:17:12.780 "zoned": false, 00:17:12.780 "supported_io_types": { 00:17:12.780 "read": true, 00:17:12.780 "write": true, 00:17:12.780 "unmap": true, 00:17:12.780 "flush": true, 00:17:12.780 "reset": true, 00:17:12.780 "nvme_admin": false, 00:17:12.780 "nvme_io": false, 00:17:12.780 "nvme_io_md": false, 00:17:12.780 "write_zeroes": true, 00:17:12.780 "zcopy": true, 00:17:12.780 "get_zone_info": false, 00:17:12.780 "zone_management": false, 00:17:12.780 "zone_append": false, 00:17:12.780 "compare": false, 00:17:12.780 "compare_and_write": false, 00:17:12.780 "abort": true, 00:17:12.780 "seek_hole": false, 00:17:12.780 "seek_data": false, 00:17:12.780 "copy": true, 00:17:12.780 "nvme_iov_md": false 00:17:12.780 }, 00:17:12.780 "memory_domains": [ 00:17:12.780 { 00:17:12.780 "dma_device_id": "system", 00:17:12.780 "dma_device_type": 1 00:17:12.780 }, 00:17:12.780 { 00:17:12.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.780 "dma_device_type": 2 00:17:12.780 } 00:17:12.780 ], 00:17:12.780 "driver_specific": { 00:17:12.780 "passthru": { 00:17:12.780 "name": "pt3", 00:17:12.780 "base_bdev_name": "malloc3" 00:17:12.780 } 00:17:12.780 } 00:17:12.780 }' 00:17:12.780 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.780 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.780 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:12.780 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.780 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.780 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:17:12.780 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.039 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.039 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.039 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.039 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.039 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.040 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:13.040 09:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:13.298 [2024-07-15 09:21:22.128784] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:13.298 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 7af8e277-5456-4c7b-babd-78d21b7cdd29 '!=' 7af8e277-5456-4c7b-babd-78d21b7cdd29 ']' 00:17:13.298 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:13.298 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:13.298 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:13.298 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:13.558 [2024-07-15 09:21:22.377204] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.558 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:13.817 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.817 "name": "raid_bdev1", 00:17:13.817 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:13.817 "strip_size_kb": 0, 00:17:13.817 "state": "online", 00:17:13.817 "raid_level": "raid1", 00:17:13.817 "superblock": true, 
00:17:13.817 "num_base_bdevs": 3, 00:17:13.817 "num_base_bdevs_discovered": 2, 00:17:13.817 "num_base_bdevs_operational": 2, 00:17:13.817 "base_bdevs_list": [ 00:17:13.817 { 00:17:13.817 "name": null, 00:17:13.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:13.817 "is_configured": false, 00:17:13.817 "data_offset": 2048, 00:17:13.817 "data_size": 63488 00:17:13.817 }, 00:17:13.817 { 00:17:13.817 "name": "pt2", 00:17:13.817 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:13.817 "is_configured": true, 00:17:13.817 "data_offset": 2048, 00:17:13.817 "data_size": 63488 00:17:13.817 }, 00:17:13.817 { 00:17:13.817 "name": "pt3", 00:17:13.817 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:13.817 "is_configured": true, 00:17:13.817 "data_offset": 2048, 00:17:13.817 "data_size": 63488 00:17:13.817 } 00:17:13.817 ] 00:17:13.817 }' 00:17:13.817 09:21:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.817 09:21:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:14.384 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:14.642 [2024-07-15 09:21:23.403886] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:14.642 [2024-07-15 09:21:23.403911] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:14.642 [2024-07-15 09:21:23.403966] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:14.642 [2024-07-15 09:21:23.404019] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:14.642 [2024-07-15 09:21:23.404031] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x233cc00 name raid_bdev1, state offline 00:17:14.642 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.642 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:14.901 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:14.901 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:14.901 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:14.901 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:14.901 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:15.159 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:15.159 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:15.159 09:21:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:15.418 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:15.418 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:15.418 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:15.418 09:21:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:15.418 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:15.418 [2024-07-15 09:21:24.362455] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:15.418 [2024-07-15 09:21:24.362498] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:15.418 [2024-07-15 09:21:24.362515] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x219a310 00:17:15.418 [2024-07-15 09:21:24.362527] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:15.418 [2024-07-15 09:21:24.364121] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:15.418 [2024-07-15 09:21:24.364150] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:15.418 [2024-07-15 09:21:24.364213] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:15.418 [2024-07-15 09:21:24.364241] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:15.418 pt2 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.677 "name": "raid_bdev1", 00:17:15.677 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:15.677 "strip_size_kb": 0, 00:17:15.677 "state": "configuring", 00:17:15.677 "raid_level": "raid1", 00:17:15.677 "superblock": true, 00:17:15.677 "num_base_bdevs": 3, 00:17:15.677 "num_base_bdevs_discovered": 1, 00:17:15.677 "num_base_bdevs_operational": 2, 00:17:15.677 "base_bdevs_list": [ 00:17:15.677 { 00:17:15.677 "name": null, 00:17:15.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.677 "is_configured": false, 00:17:15.677 "data_offset": 2048, 00:17:15.677 "data_size": 63488 00:17:15.677 }, 00:17:15.677 { 00:17:15.677 "name": "pt2", 00:17:15.677 "uuid": "00000000-0000-0000-0000-000000000002", 
00:17:15.677 "is_configured": true, 00:17:15.677 "data_offset": 2048, 00:17:15.677 "data_size": 63488 00:17:15.677 }, 00:17:15.677 { 00:17:15.677 "name": null, 00:17:15.677 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:15.677 "is_configured": false, 00:17:15.677 "data_offset": 2048, 00:17:15.677 "data_size": 63488 00:17:15.677 } 00:17:15.677 ] 00:17:15.677 }' 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.677 09:21:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.245 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:16.245 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:16.245 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:17:16.245 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:16.505 [2024-07-15 09:21:25.369147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:16.505 [2024-07-15 09:21:25.369192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:16.505 [2024-07-15 09:21:25.369211] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2198ec0 00:17:16.505 [2024-07-15 09:21:25.369224] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:16.505 [2024-07-15 09:21:25.369546] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:16.505 [2024-07-15 09:21:25.369562] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:16.505 [2024-07-15 09:21:25.369622] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:16.505 [2024-07-15 09:21:25.369640] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:16.505 [2024-07-15 09:21:25.369736] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x233acc0 00:17:16.505 [2024-07-15 09:21:25.369747] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:16.505 [2024-07-15 09:21:25.369906] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x233b6d0 00:17:16.505 [2024-07-15 09:21:25.370038] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x233acc0 00:17:16.505 [2024-07-15 09:21:25.370049] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x233acc0 00:17:16.505 [2024-07-15 09:21:25.370145] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:16.505 pt3 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:16.505 09:21:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.505 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:16.764 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.764 "name": "raid_bdev1", 00:17:16.764 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:16.764 "strip_size_kb": 0, 00:17:16.764 "state": "online", 00:17:16.764 "raid_level": "raid1", 00:17:16.764 "superblock": true, 00:17:16.764 "num_base_bdevs": 3, 00:17:16.764 "num_base_bdevs_discovered": 2, 00:17:16.764 "num_base_bdevs_operational": 2, 00:17:16.764 "base_bdevs_list": [ 00:17:16.764 { 00:17:16.764 "name": null, 00:17:16.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:16.764 "is_configured": false, 00:17:16.764 "data_offset": 2048, 00:17:16.764 "data_size": 63488 00:17:16.764 }, 00:17:16.764 { 00:17:16.764 "name": "pt2", 00:17:16.764 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:16.764 "is_configured": true, 00:17:16.764 "data_offset": 2048, 00:17:16.764 "data_size": 63488 00:17:16.764 }, 00:17:16.764 { 00:17:16.764 "name": "pt3", 00:17:16.764 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:16.764 "is_configured": true, 00:17:16.764 "data_offset": 2048, 00:17:16.764 "data_size": 63488 00:17:16.764 } 00:17:16.764 ] 00:17:16.764 }' 00:17:16.764 09:21:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.764 09:21:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.385 09:21:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:17.643 [2024-07-15 09:21:26.472076] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:17.643 [2024-07-15 09:21:26.472102] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:17.643 [2024-07-15 09:21:26.472149] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:17.643 [2024-07-15 09:21:26.472202] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:17.643 [2024-07-15 09:21:26.472213] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x233acc0 name raid_bdev1, state offline 00:17:17.643 09:21:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.643 09:21:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:17.902 09:21:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:17.902 09:21:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:17.902 09:21:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 
00:17:17.902 09:21:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:17:17.902 09:21:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:18.160 09:21:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:18.419 [2024-07-15 09:21:27.210006] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:18.419 [2024-07-15 09:21:27.210046] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.419 [2024-07-15 09:21:27.210062] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2198ec0 00:17:18.419 [2024-07-15 09:21:27.210074] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.419 [2024-07-15 09:21:27.211653] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.419 [2024-07-15 09:21:27.211680] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:18.419 [2024-07-15 09:21:27.211740] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:18.419 [2024-07-15 09:21:27.211767] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:18.419 [2024-07-15 09:21:27.211860] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:18.419 [2024-07-15 09:21:27.211873] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:18.419 [2024-07-15 09:21:27.211887] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x233af40 name raid_bdev1, state configuring 00:17:18.419 [2024-07-15 09:21:27.211910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:18.419 pt1 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.419 09:21:27 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:18.678 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.678 "name": "raid_bdev1", 00:17:18.678 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:18.678 "strip_size_kb": 0, 00:17:18.678 "state": "configuring", 00:17:18.678 "raid_level": "raid1", 00:17:18.678 "superblock": true, 00:17:18.678 "num_base_bdevs": 3, 00:17:18.678 "num_base_bdevs_discovered": 1, 00:17:18.678 "num_base_bdevs_operational": 2, 00:17:18.678 "base_bdevs_list": [ 00:17:18.678 { 00:17:18.678 "name": null, 00:17:18.678 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.678 "is_configured": false, 00:17:18.678 "data_offset": 2048, 00:17:18.678 "data_size": 63488 00:17:18.678 }, 00:17:18.678 { 00:17:18.678 "name": "pt2", 00:17:18.678 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:18.678 "is_configured": true, 00:17:18.678 "data_offset": 2048, 00:17:18.678 "data_size": 63488 00:17:18.678 }, 00:17:18.678 { 00:17:18.678 "name": null, 00:17:18.678 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:18.678 "is_configured": false, 00:17:18.678 "data_offset": 2048, 00:17:18.678 "data_size": 63488 00:17:18.678 } 00:17:18.678 ] 00:17:18.678 }' 00:17:18.678 09:21:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.678 09:21:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.612 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:19.612 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:19.612 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:19.612 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:19.870 [2024-07-15 09:21:28.746109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:19.870 [2024-07-15 09:21:28.746164] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:19.870 [2024-07-15 09:21:28.746184] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x219c0c0 00:17:19.870 [2024-07-15 09:21:28.746197] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:19.870 [2024-07-15 09:21:28.746540] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:19.870 [2024-07-15 09:21:28.746557] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:19.870 [2024-07-15 09:21:28.746623] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:19.870 [2024-07-15 09:21:28.746643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:19.870 [2024-07-15 09:21:28.746745] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x219ca40 00:17:19.870 [2024-07-15 09:21:28.746756] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:19.870 [2024-07-15 09:21:28.746920] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x233b6c0 00:17:19.870 [2024-07-15 09:21:28.747068] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x219ca40 00:17:19.870 [2024-07-15 09:21:28.747079] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x219ca40 00:17:19.870 [2024-07-15 09:21:28.747177] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:19.870 pt3 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.870 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.871 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.871 09:21:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:20.129 09:21:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.129 "name": "raid_bdev1", 00:17:20.129 "uuid": "7af8e277-5456-4c7b-babd-78d21b7cdd29", 00:17:20.129 "strip_size_kb": 0, 00:17:20.129 "state": "online", 00:17:20.129 "raid_level": "raid1", 00:17:20.129 "superblock": true, 00:17:20.129 "num_base_bdevs": 3, 00:17:20.129 "num_base_bdevs_discovered": 2, 00:17:20.129 "num_base_bdevs_operational": 2, 00:17:20.129 "base_bdevs_list": [ 00:17:20.129 { 00:17:20.129 "name": null, 00:17:20.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.129 "is_configured": false, 00:17:20.129 "data_offset": 2048, 00:17:20.129 "data_size": 63488 00:17:20.129 }, 00:17:20.129 { 00:17:20.129 "name": "pt2", 00:17:20.129 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:20.129 "is_configured": true, 00:17:20.129 "data_offset": 2048, 00:17:20.129 "data_size": 63488 00:17:20.129 }, 00:17:20.129 { 00:17:20.129 "name": "pt3", 00:17:20.129 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:20.129 "is_configured": true, 00:17:20.129 "data_offset": 2048, 00:17:20.129 "data_size": 63488 00:17:20.129 } 00:17:20.129 ] 00:17:20.129 }' 00:17:20.129 09:21:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.129 09:21:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.695 09:21:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:20.695 09:21:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:20.953 09:21:29 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:20.953 09:21:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:20.953 09:21:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:21.212 [2024-07-15 09:21:30.001677] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 7af8e277-5456-4c7b-babd-78d21b7cdd29 '!=' 7af8e277-5456-4c7b-babd-78d21b7cdd29 ']' 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 140041 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 140041 ']' 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 140041 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 140041 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 140041' 00:17:21.212 killing process with pid 140041 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 140041 00:17:21.212 [2024-07-15 09:21:30.070436] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:21.212 [2024-07-15 09:21:30.070500] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:21.212 [2024-07-15 09:21:30.070553] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:21.212 [2024-07-15 09:21:30.070568] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x219ca40 name raid_bdev1, state offline 00:17:21.212 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 140041 00:17:21.212 [2024-07-15 09:21:30.100705] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:21.471 09:21:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:21.471 00:17:21.471 real 0m21.889s 00:17:21.471 user 0m39.969s 00:17:21.471 sys 0m3.939s 00:17:21.471 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:21.471 09:21:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:21.471 ************************************ 00:17:21.471 END TEST raid_superblock_test 00:17:21.471 ************************************ 00:17:21.471 09:21:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:21.471 09:21:30 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:17:21.471 09:21:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:21.471 09:21:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:21.471 09:21:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:21.471 
************************************ 00:17:21.471 START TEST raid_read_error_test 00:17:21.471 ************************************ 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:21.471 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.npVC7DWPqd 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=143314 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 143314 /var/tmp/spdk-raid.sock 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 143314 ']' 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process 
to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:21.472 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:21.472 09:21:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:21.731 [2024-07-15 09:21:30.481172] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:17:21.731 [2024-07-15 09:21:30.481239] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid143314 ] 00:17:21.731 [2024-07-15 09:21:30.606768] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:21.990 [2024-07-15 09:21:30.710428] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:21.990 [2024-07-15 09:21:30.771399] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:21.990 [2024-07-15 09:21:30.771436] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:22.928 09:21:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:22.928 09:21:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:22.928 09:21:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:22.928 09:21:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:23.187 BaseBdev1_malloc 00:17:23.187 09:21:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:23.446 true 00:17:23.446 09:21:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:23.446 [2024-07-15 09:21:32.392318] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:23.446 [2024-07-15 09:21:32.392363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:23.446 [2024-07-15 09:21:32.392384] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e4e0d0 00:17:23.446 [2024-07-15 09:21:32.392398] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:23.446 [2024-07-15 09:21:32.394298] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:23.446 [2024-07-15 09:21:32.394330] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:23.446 BaseBdev1 00:17:23.705 09:21:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:23.705 09:21:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:23.705 BaseBdev2_malloc 00:17:23.964 09:21:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:23.964 true 00:17:23.964 09:21:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:24.223 [2024-07-15 09:21:33.128135] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:24.223 [2024-07-15 09:21:33.128180] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:24.223 [2024-07-15 09:21:33.128200] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e52910 00:17:24.223 [2024-07-15 09:21:33.128213] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:24.223 [2024-07-15 09:21:33.129788] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:24.223 [2024-07-15 09:21:33.129818] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:24.223 BaseBdev2 00:17:24.223 09:21:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:24.223 09:21:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:24.483 BaseBdev3_malloc 00:17:24.483 09:21:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:24.742 true 00:17:24.742 09:21:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:25.000 [2024-07-15 09:21:33.855555] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:25.000 [2024-07-15 09:21:33.855602] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:25.000 [2024-07-15 09:21:33.855622] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e54bd0 00:17:25.000 [2024-07-15 09:21:33.855635] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:25.000 [2024-07-15 09:21:33.857190] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:25.000 [2024-07-15 09:21:33.857218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:25.000 BaseBdev3 00:17:25.000 09:21:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:25.259 [2024-07-15 09:21:34.100231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:25.259 [2024-07-15 09:21:34.101605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:25.259 [2024-07-15 09:21:34.101674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:25.259 [2024-07-15 
09:21:34.101885] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e56280 00:17:25.259 [2024-07-15 09:21:34.101898] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:25.259 [2024-07-15 09:21:34.102105] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e55e20 00:17:25.259 [2024-07-15 09:21:34.102262] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e56280 00:17:25.259 [2024-07-15 09:21:34.102273] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e56280 00:17:25.259 [2024-07-15 09:21:34.102380] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.259 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:25.517 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.517 "name": "raid_bdev1", 00:17:25.517 "uuid": "bf7cf92e-aa8b-4415-9637-398cb5deb565", 00:17:25.517 "strip_size_kb": 0, 00:17:25.517 "state": "online", 00:17:25.517 "raid_level": "raid1", 00:17:25.517 "superblock": true, 00:17:25.517 "num_base_bdevs": 3, 00:17:25.517 "num_base_bdevs_discovered": 3, 00:17:25.517 "num_base_bdevs_operational": 3, 00:17:25.517 "base_bdevs_list": [ 00:17:25.517 { 00:17:25.517 "name": "BaseBdev1", 00:17:25.517 "uuid": "50e2ca05-9a59-577d-ad7c-a7e4cfc35ac1", 00:17:25.517 "is_configured": true, 00:17:25.517 "data_offset": 2048, 00:17:25.517 "data_size": 63488 00:17:25.517 }, 00:17:25.517 { 00:17:25.517 "name": "BaseBdev2", 00:17:25.517 "uuid": "0c4855bf-1158-55b8-8cf3-4f377ad42e04", 00:17:25.517 "is_configured": true, 00:17:25.517 "data_offset": 2048, 00:17:25.517 "data_size": 63488 00:17:25.517 }, 00:17:25.517 { 00:17:25.517 "name": "BaseBdev3", 00:17:25.517 "uuid": "835397c5-7fec-5d4b-997a-7814dad636b3", 00:17:25.517 "is_configured": true, 00:17:25.517 "data_offset": 2048, 00:17:25.517 "data_size": 63488 00:17:25.517 } 00:17:25.517 ] 00:17:25.517 }' 00:17:25.517 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.517 09:21:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set 
+x 00:17:26.083 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:26.083 09:21:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:26.341 [2024-07-15 09:21:35.051037] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ca3e00 00:17:27.324 09:21:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.324 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:27.583 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.583 "name": "raid_bdev1", 00:17:27.583 "uuid": "bf7cf92e-aa8b-4415-9637-398cb5deb565", 00:17:27.583 "strip_size_kb": 0, 00:17:27.583 "state": "online", 00:17:27.583 "raid_level": "raid1", 00:17:27.583 "superblock": true, 00:17:27.583 "num_base_bdevs": 3, 00:17:27.583 "num_base_bdevs_discovered": 3, 00:17:27.583 "num_base_bdevs_operational": 3, 00:17:27.583 "base_bdevs_list": [ 00:17:27.583 { 00:17:27.583 "name": "BaseBdev1", 00:17:27.583 "uuid": "50e2ca05-9a59-577d-ad7c-a7e4cfc35ac1", 00:17:27.583 "is_configured": true, 00:17:27.583 "data_offset": 2048, 00:17:27.583 "data_size": 63488 00:17:27.583 }, 00:17:27.583 { 00:17:27.583 "name": "BaseBdev2", 00:17:27.583 "uuid": "0c4855bf-1158-55b8-8cf3-4f377ad42e04", 00:17:27.583 "is_configured": true, 00:17:27.583 "data_offset": 2048, 00:17:27.583 "data_size": 63488 00:17:27.583 }, 00:17:27.583 { 00:17:27.583 "name": "BaseBdev3", 00:17:27.583 "uuid": 
"835397c5-7fec-5d4b-997a-7814dad636b3", 00:17:27.583 "is_configured": true, 00:17:27.583 "data_offset": 2048, 00:17:27.583 "data_size": 63488 00:17:27.583 } 00:17:27.583 ] 00:17:27.583 }' 00:17:27.583 09:21:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.583 09:21:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.149 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:28.407 [2024-07-15 09:21:37.286275] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:28.407 [2024-07-15 09:21:37.286311] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:28.407 [2024-07-15 09:21:37.289522] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:28.407 [2024-07-15 09:21:37.289572] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:28.407 [2024-07-15 09:21:37.289670] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:28.407 [2024-07-15 09:21:37.289682] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e56280 name raid_bdev1, state offline 00:17:28.407 0 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 143314 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 143314 ']' 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 143314 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 143314 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 143314' 00:17:28.407 killing process with pid 143314 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 143314 00:17:28.407 [2024-07-15 09:21:37.352549] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:28.407 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 143314 00:17:28.666 [2024-07-15 09:21:37.374305] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.npVC7DWPqd 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # 
return 0 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:28.666 00:17:28.666 real 0m7.208s 00:17:28.666 user 0m11.537s 00:17:28.666 sys 0m1.210s 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:28.666 09:21:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.666 ************************************ 00:17:28.666 END TEST raid_read_error_test 00:17:28.666 ************************************ 00:17:28.925 09:21:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:28.925 09:21:37 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:17:28.925 09:21:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:28.925 09:21:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:28.925 09:21:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:28.925 ************************************ 00:17:28.925 START TEST raid_write_error_test 00:17:28.925 ************************************ 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RFzrEGmCQZ 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=144312 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 144312 /var/tmp/spdk-raid.sock 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 144312 ']' 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:28.925 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:28.925 09:21:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.925 [2024-07-15 09:21:37.778227] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:17:28.925 [2024-07-15 09:21:37.778301] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144312 ] 00:17:29.184 [2024-07-15 09:21:37.910095] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:29.184 [2024-07-15 09:21:38.008469] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:29.184 [2024-07-15 09:21:38.065566] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:29.184 [2024-07-15 09:21:38.065595] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:30.122 09:21:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:30.122 09:21:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:30.122 09:21:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:30.122 09:21:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:30.122 BaseBdev1_malloc 00:17:30.122 09:21:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:30.380 true 00:17:30.381 09:21:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:30.640 [2024-07-15 09:21:39.434301] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:30.640 [2024-07-15 09:21:39.434348] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:30.640 [2024-07-15 09:21:39.434369] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c30d0 00:17:30.640 [2024-07-15 09:21:39.434382] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:30.640 [2024-07-15 09:21:39.436139] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:30.640 [2024-07-15 09:21:39.436168] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:30.640 BaseBdev1 00:17:30.640 09:21:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:30.640 09:21:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:30.899 BaseBdev2_malloc 00:17:30.899 09:21:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:31.158 true 00:17:31.158 09:21:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:31.417 [2024-07-15 09:21:40.156826] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:31.417 [2024-07-15 09:21:40.156873] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:31.417 [2024-07-15 09:21:40.156893] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c7910 00:17:31.417 [2024-07-15 09:21:40.156906] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:31.417 [2024-07-15 09:21:40.158464] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:31.417 [2024-07-15 09:21:40.158493] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:31.417 BaseBdev2 00:17:31.417 09:21:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:31.417 09:21:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:31.676 BaseBdev3_malloc 00:17:31.676 09:21:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:31.969 true 00:17:31.969 09:21:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:31.969 [2024-07-15 09:21:40.895334] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:31.969 [2024-07-15 09:21:40.895382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:31.969 [2024-07-15 09:21:40.895401] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c9bd0 00:17:31.969 [2024-07-15 09:21:40.895414] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:31.969 [2024-07-15 09:21:40.896871] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:31.969 [2024-07-15 09:21:40.896900] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:31.969 BaseBdev3 00:17:32.229 09:21:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:32.229 [2024-07-15 09:21:41.140008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:32.229 [2024-07-15 09:21:41.141223] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:32.229 [2024-07-15 09:21:41.141289] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:32.229 [2024-07-15 09:21:41.141492] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11cb280 00:17:32.229 [2024-07-15 09:21:41.141503] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:32.229 [2024-07-15 09:21:41.141689] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11cae20 00:17:32.229 [2024-07-15 09:21:41.141836] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11cb280 00:17:32.229 [2024-07-15 09:21:41.141847] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11cb280 00:17:32.229 [2024-07-15 09:21:41.141953] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:32.229 
09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:32.229 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.488 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.488 "name": "raid_bdev1", 00:17:32.488 "uuid": "67abb0f7-c093-4dca-8e44-bd8c41b251ae", 00:17:32.488 "strip_size_kb": 0, 00:17:32.488 "state": "online", 00:17:32.488 "raid_level": "raid1", 00:17:32.488 "superblock": true, 00:17:32.488 "num_base_bdevs": 3, 00:17:32.488 "num_base_bdevs_discovered": 3, 00:17:32.488 "num_base_bdevs_operational": 3, 00:17:32.488 "base_bdevs_list": [ 00:17:32.488 { 00:17:32.488 "name": "BaseBdev1", 00:17:32.488 "uuid": "02bbff58-faa0-51be-b368-f863486fb6de", 00:17:32.488 "is_configured": true, 00:17:32.488 "data_offset": 2048, 00:17:32.488 "data_size": 63488 00:17:32.488 }, 00:17:32.488 { 00:17:32.488 "name": "BaseBdev2", 00:17:32.488 "uuid": "46288701-6b50-5732-b336-9d2a6a59f17c", 00:17:32.488 "is_configured": true, 00:17:32.488 "data_offset": 2048, 00:17:32.488 "data_size": 63488 00:17:32.488 }, 00:17:32.488 { 00:17:32.488 "name": "BaseBdev3", 00:17:32.488 "uuid": "50a24101-0750-5095-8f34-31f0430f9fd7", 00:17:32.488 "is_configured": true, 00:17:32.488 "data_offset": 2048, 00:17:32.488 "data_size": 63488 00:17:32.488 } 00:17:32.488 ] 00:17:32.488 }' 00:17:32.488 09:21:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.488 09:21:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.425 09:21:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:33.425 09:21:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:33.425 [2024-07-15 09:21:42.118900] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1018e00 00:17:34.364 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:34.364 [2024-07-15 09:21:43.239451] 
bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:34.364 [2024-07-15 09:21:43.239512] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:34.364 [2024-07-15 09:21:43.239707] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1018e00 00:17:34.364 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:34.364 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:34.364 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:17:34.364 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:17:34.364 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.365 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:34.623 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.623 "name": "raid_bdev1", 00:17:34.623 "uuid": "67abb0f7-c093-4dca-8e44-bd8c41b251ae", 00:17:34.623 "strip_size_kb": 0, 00:17:34.623 "state": "online", 00:17:34.623 "raid_level": "raid1", 00:17:34.623 "superblock": true, 00:17:34.623 "num_base_bdevs": 3, 00:17:34.623 "num_base_bdevs_discovered": 2, 00:17:34.623 "num_base_bdevs_operational": 2, 00:17:34.623 "base_bdevs_list": [ 00:17:34.623 { 00:17:34.623 "name": null, 00:17:34.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:34.623 "is_configured": false, 00:17:34.623 "data_offset": 2048, 00:17:34.623 "data_size": 63488 00:17:34.623 }, 00:17:34.623 { 00:17:34.623 "name": "BaseBdev2", 00:17:34.623 "uuid": "46288701-6b50-5732-b336-9d2a6a59f17c", 00:17:34.623 "is_configured": true, 00:17:34.623 "data_offset": 2048, 00:17:34.623 "data_size": 63488 00:17:34.623 }, 00:17:34.623 { 00:17:34.623 "name": "BaseBdev3", 00:17:34.623 "uuid": "50a24101-0750-5095-8f34-31f0430f9fd7", 00:17:34.623 "is_configured": true, 00:17:34.623 "data_offset": 2048, 00:17:34.623 "data_size": 63488 00:17:34.623 } 00:17:34.623 ] 00:17:34.623 }' 00:17:34.623 09:21:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.623 
09:21:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.191 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:35.451 [2024-07-15 09:21:44.310671] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:35.451 [2024-07-15 09:21:44.310716] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:35.451 [2024-07-15 09:21:44.313851] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:35.451 [2024-07-15 09:21:44.313882] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:35.451 [2024-07-15 09:21:44.313959] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:35.451 [2024-07-15 09:21:44.313971] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11cb280 name raid_bdev1, state offline 00:17:35.451 0 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 144312 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 144312 ']' 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 144312 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 144312 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 144312' 00:17:35.451 killing process with pid 144312 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 144312 00:17:35.451 [2024-07-15 09:21:44.377568] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:35.451 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 144312 00:17:35.451 [2024-07-15 09:21:44.398368] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RFzrEGmCQZ 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:35.711 00:17:35.711 real 0m6.939s 00:17:35.711 user 0m11.001s 00:17:35.711 sys 0m1.210s 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:17:35.711 09:21:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.711 ************************************ 00:17:35.711 END TEST raid_write_error_test 00:17:35.711 ************************************ 00:17:35.970 09:21:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:35.971 09:21:44 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:17:35.971 09:21:44 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:35.971 09:21:44 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:17:35.971 09:21:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:35.971 09:21:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:35.971 09:21:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:35.971 ************************************ 00:17:35.971 START TEST raid_state_function_test 00:17:35.971 ************************************ 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=145408 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 145408' 00:17:35.971 Process raid pid: 145408 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 145408 /var/tmp/spdk-raid.sock 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 145408 ']' 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:35.971 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:35.971 09:21:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.971 [2024-07-15 09:21:44.795076] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
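The stretch of trace around this point shows how raid_state_function_test brings up its own SPDK application before issuing any bdev_raid RPCs: it launches test/app/bdev_svc/bdev_svc on a private RPC socket with bdev_raid debug logging enabled, records the pid, and blocks in waitforlisten until the socket accepts connections. A minimal sketch of that harness setup follows, assuming the same workspace layout as this run and that common/autotest_common.sh (which provides waitforlisten) is available at the path shown; the paths and variable names here are illustrative, not the test's actual code.

    # sketch only: reproduce the bdev_svc startup seen in the surrounding trace
    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC_SOCK=/var/tmp/spdk-raid.sock
    source "$SPDK_DIR/test/common/autotest_common.sh"   # assumed location of the waitforlisten helper
    # bdev-only app, instance id 0, bdev_raid debug logging, dedicated RPC socket
    "$SPDK_DIR/test/app/bdev_svc/bdev_svc" -r "$RPC_SOCK" -i 0 -L bdev_raid &
    raid_pid=$!
    # block until the app is listening on the UNIX-domain socket, then RPCs can be issued against it
    waitforlisten "$raid_pid" "$RPC_SOCK"
    "$SPDK_DIR/scripts/rpc.py" -s "$RPC_SOCK" bdev_raid_get_bdevs all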
00:17:35.971 [2024-07-15 09:21:44.795143] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:36.230 [2024-07-15 09:21:44.923844] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:36.230 [2024-07-15 09:21:45.029667] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:36.230 [2024-07-15 09:21:45.091263] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:36.230 [2024-07-15 09:21:45.091317] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:36.797 09:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:36.797 09:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:36.797 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:37.056 [2024-07-15 09:21:45.959350] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:37.056 [2024-07-15 09:21:45.959392] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:37.056 [2024-07-15 09:21:45.959403] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:37.056 [2024-07-15 09:21:45.959415] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:37.056 [2024-07-15 09:21:45.959424] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:37.056 [2024-07-15 09:21:45.959435] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:37.056 [2024-07-15 09:21:45.959444] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:37.056 [2024-07-15 09:21:45.959455] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.056 09:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.315 09:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.315 "name": "Existed_Raid", 00:17:37.315 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.315 "strip_size_kb": 64, 00:17:37.315 "state": "configuring", 00:17:37.315 "raid_level": "raid0", 00:17:37.315 "superblock": false, 00:17:37.315 "num_base_bdevs": 4, 00:17:37.315 "num_base_bdevs_discovered": 0, 00:17:37.315 "num_base_bdevs_operational": 4, 00:17:37.315 "base_bdevs_list": [ 00:17:37.315 { 00:17:37.315 "name": "BaseBdev1", 00:17:37.315 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.315 "is_configured": false, 00:17:37.315 "data_offset": 0, 00:17:37.315 "data_size": 0 00:17:37.316 }, 00:17:37.316 { 00:17:37.316 "name": "BaseBdev2", 00:17:37.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.316 "is_configured": false, 00:17:37.316 "data_offset": 0, 00:17:37.316 "data_size": 0 00:17:37.316 }, 00:17:37.316 { 00:17:37.316 "name": "BaseBdev3", 00:17:37.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.316 "is_configured": false, 00:17:37.316 "data_offset": 0, 00:17:37.316 "data_size": 0 00:17:37.316 }, 00:17:37.316 { 00:17:37.316 "name": "BaseBdev4", 00:17:37.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.316 "is_configured": false, 00:17:37.316 "data_offset": 0, 00:17:37.316 "data_size": 0 00:17:37.316 } 00:17:37.316 ] 00:17:37.316 }' 00:17:37.316 09:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.316 09:21:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.253 09:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:38.253 [2024-07-15 09:21:47.058113] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:38.253 [2024-07-15 09:21:47.058142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e18aa0 name Existed_Raid, state configuring 00:17:38.253 09:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:38.511 [2024-07-15 09:21:47.298765] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:38.511 [2024-07-15 09:21:47.298791] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:38.511 [2024-07-15 09:21:47.298801] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:38.511 [2024-07-15 09:21:47.298812] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:38.511 [2024-07-15 09:21:47.298820] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:38.511 [2024-07-15 09:21:47.298832] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:38.511 [2024-07-15 09:21:47.298840] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:38.511 
[2024-07-15 09:21:47.298851] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:38.511 09:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:38.770 [2024-07-15 09:21:47.553230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:38.770 BaseBdev1 00:17:38.770 09:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:38.770 09:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:38.770 09:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:38.770 09:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:38.770 09:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:38.770 09:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:38.770 09:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:39.028 09:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:39.287 [ 00:17:39.287 { 00:17:39.287 "name": "BaseBdev1", 00:17:39.287 "aliases": [ 00:17:39.287 "a1c15a68-b66c-479f-972b-0bd6366f16ec" 00:17:39.287 ], 00:17:39.287 "product_name": "Malloc disk", 00:17:39.287 "block_size": 512, 00:17:39.287 "num_blocks": 65536, 00:17:39.287 "uuid": "a1c15a68-b66c-479f-972b-0bd6366f16ec", 00:17:39.287 "assigned_rate_limits": { 00:17:39.287 "rw_ios_per_sec": 0, 00:17:39.287 "rw_mbytes_per_sec": 0, 00:17:39.287 "r_mbytes_per_sec": 0, 00:17:39.287 "w_mbytes_per_sec": 0 00:17:39.287 }, 00:17:39.287 "claimed": true, 00:17:39.287 "claim_type": "exclusive_write", 00:17:39.287 "zoned": false, 00:17:39.287 "supported_io_types": { 00:17:39.287 "read": true, 00:17:39.287 "write": true, 00:17:39.287 "unmap": true, 00:17:39.287 "flush": true, 00:17:39.287 "reset": true, 00:17:39.287 "nvme_admin": false, 00:17:39.287 "nvme_io": false, 00:17:39.287 "nvme_io_md": false, 00:17:39.287 "write_zeroes": true, 00:17:39.287 "zcopy": true, 00:17:39.287 "get_zone_info": false, 00:17:39.287 "zone_management": false, 00:17:39.287 "zone_append": false, 00:17:39.287 "compare": false, 00:17:39.287 "compare_and_write": false, 00:17:39.287 "abort": true, 00:17:39.287 "seek_hole": false, 00:17:39.287 "seek_data": false, 00:17:39.287 "copy": true, 00:17:39.287 "nvme_iov_md": false 00:17:39.287 }, 00:17:39.288 "memory_domains": [ 00:17:39.288 { 00:17:39.288 "dma_device_id": "system", 00:17:39.288 "dma_device_type": 1 00:17:39.288 }, 00:17:39.288 { 00:17:39.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.288 "dma_device_type": 2 00:17:39.288 } 00:17:39.288 ], 00:17:39.288 "driver_specific": {} 00:17:39.288 } 00:17:39.288 ] 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.288 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.547 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.547 "name": "Existed_Raid", 00:17:39.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.547 "strip_size_kb": 64, 00:17:39.547 "state": "configuring", 00:17:39.547 "raid_level": "raid0", 00:17:39.547 "superblock": false, 00:17:39.547 "num_base_bdevs": 4, 00:17:39.547 "num_base_bdevs_discovered": 1, 00:17:39.547 "num_base_bdevs_operational": 4, 00:17:39.547 "base_bdevs_list": [ 00:17:39.547 { 00:17:39.547 "name": "BaseBdev1", 00:17:39.547 "uuid": "a1c15a68-b66c-479f-972b-0bd6366f16ec", 00:17:39.547 "is_configured": true, 00:17:39.547 "data_offset": 0, 00:17:39.547 "data_size": 65536 00:17:39.547 }, 00:17:39.547 { 00:17:39.547 "name": "BaseBdev2", 00:17:39.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.547 "is_configured": false, 00:17:39.547 "data_offset": 0, 00:17:39.547 "data_size": 0 00:17:39.547 }, 00:17:39.547 { 00:17:39.547 "name": "BaseBdev3", 00:17:39.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.547 "is_configured": false, 00:17:39.547 "data_offset": 0, 00:17:39.547 "data_size": 0 00:17:39.547 }, 00:17:39.547 { 00:17:39.547 "name": "BaseBdev4", 00:17:39.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.547 "is_configured": false, 00:17:39.547 "data_offset": 0, 00:17:39.547 "data_size": 0 00:17:39.547 } 00:17:39.547 ] 00:17:39.547 }' 00:17:39.547 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.547 09:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.114 09:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:40.374 [2024-07-15 09:21:49.161486] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:40.374 [2024-07-15 09:21:49.161526] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e18310 name Existed_Raid, state configuring 00:17:40.374 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:40.633 [2024-07-15 09:21:49.402167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:40.633 [2024-07-15 09:21:49.403709] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:40.633 [2024-07-15 09:21:49.403742] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:40.633 [2024-07-15 09:21:49.403752] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:40.633 [2024-07-15 09:21:49.403764] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:40.633 [2024-07-15 09:21:49.403773] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:40.633 [2024-07-15 09:21:49.403784] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.633 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.893 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.893 "name": "Existed_Raid", 00:17:40.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.893 "strip_size_kb": 64, 00:17:40.893 "state": "configuring", 00:17:40.893 "raid_level": "raid0", 00:17:40.893 "superblock": false, 00:17:40.893 "num_base_bdevs": 4, 00:17:40.893 "num_base_bdevs_discovered": 1, 00:17:40.893 "num_base_bdevs_operational": 4, 00:17:40.893 "base_bdevs_list": [ 00:17:40.893 { 00:17:40.893 "name": "BaseBdev1", 00:17:40.893 "uuid": "a1c15a68-b66c-479f-972b-0bd6366f16ec", 00:17:40.893 "is_configured": true, 00:17:40.893 "data_offset": 0, 00:17:40.893 "data_size": 65536 00:17:40.893 }, 00:17:40.893 { 00:17:40.893 "name": "BaseBdev2", 00:17:40.893 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:17:40.893 "is_configured": false, 00:17:40.893 "data_offset": 0, 00:17:40.893 "data_size": 0 00:17:40.893 }, 00:17:40.893 { 00:17:40.893 "name": "BaseBdev3", 00:17:40.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.893 "is_configured": false, 00:17:40.893 "data_offset": 0, 00:17:40.893 "data_size": 0 00:17:40.893 }, 00:17:40.893 { 00:17:40.893 "name": "BaseBdev4", 00:17:40.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.893 "is_configured": false, 00:17:40.893 "data_offset": 0, 00:17:40.893 "data_size": 0 00:17:40.893 } 00:17:40.893 ] 00:17:40.893 }' 00:17:40.893 09:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.893 09:21:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.460 09:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:41.718 [2024-07-15 09:21:50.508526] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:41.718 BaseBdev2 00:17:41.718 09:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:41.718 09:21:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:41.718 09:21:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:41.718 09:21:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:41.718 09:21:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:41.718 09:21:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:41.718 09:21:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:41.977 09:21:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:42.235 [ 00:17:42.235 { 00:17:42.235 "name": "BaseBdev2", 00:17:42.235 "aliases": [ 00:17:42.235 "3aca3d18-69a4-4154-aa52-6b1f1fc0107d" 00:17:42.235 ], 00:17:42.235 "product_name": "Malloc disk", 00:17:42.235 "block_size": 512, 00:17:42.235 "num_blocks": 65536, 00:17:42.235 "uuid": "3aca3d18-69a4-4154-aa52-6b1f1fc0107d", 00:17:42.235 "assigned_rate_limits": { 00:17:42.235 "rw_ios_per_sec": 0, 00:17:42.235 "rw_mbytes_per_sec": 0, 00:17:42.235 "r_mbytes_per_sec": 0, 00:17:42.235 "w_mbytes_per_sec": 0 00:17:42.235 }, 00:17:42.235 "claimed": true, 00:17:42.235 "claim_type": "exclusive_write", 00:17:42.235 "zoned": false, 00:17:42.235 "supported_io_types": { 00:17:42.235 "read": true, 00:17:42.235 "write": true, 00:17:42.235 "unmap": true, 00:17:42.235 "flush": true, 00:17:42.235 "reset": true, 00:17:42.235 "nvme_admin": false, 00:17:42.235 "nvme_io": false, 00:17:42.235 "nvme_io_md": false, 00:17:42.235 "write_zeroes": true, 00:17:42.235 "zcopy": true, 00:17:42.235 "get_zone_info": false, 00:17:42.235 "zone_management": false, 00:17:42.235 "zone_append": false, 00:17:42.235 "compare": false, 00:17:42.235 "compare_and_write": false, 00:17:42.235 "abort": true, 00:17:42.235 "seek_hole": false, 00:17:42.235 "seek_data": false, 00:17:42.235 
"copy": true, 00:17:42.235 "nvme_iov_md": false 00:17:42.235 }, 00:17:42.235 "memory_domains": [ 00:17:42.235 { 00:17:42.235 "dma_device_id": "system", 00:17:42.235 "dma_device_type": 1 00:17:42.235 }, 00:17:42.235 { 00:17:42.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.235 "dma_device_type": 2 00:17:42.235 } 00:17:42.235 ], 00:17:42.235 "driver_specific": {} 00:17:42.235 } 00:17:42.235 ] 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.235 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.493 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.493 "name": "Existed_Raid", 00:17:42.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.493 "strip_size_kb": 64, 00:17:42.493 "state": "configuring", 00:17:42.493 "raid_level": "raid0", 00:17:42.493 "superblock": false, 00:17:42.493 "num_base_bdevs": 4, 00:17:42.493 "num_base_bdevs_discovered": 2, 00:17:42.493 "num_base_bdevs_operational": 4, 00:17:42.493 "base_bdevs_list": [ 00:17:42.493 { 00:17:42.493 "name": "BaseBdev1", 00:17:42.493 "uuid": "a1c15a68-b66c-479f-972b-0bd6366f16ec", 00:17:42.493 "is_configured": true, 00:17:42.493 "data_offset": 0, 00:17:42.493 "data_size": 65536 00:17:42.493 }, 00:17:42.493 { 00:17:42.493 "name": "BaseBdev2", 00:17:42.493 "uuid": "3aca3d18-69a4-4154-aa52-6b1f1fc0107d", 00:17:42.493 "is_configured": true, 00:17:42.493 "data_offset": 0, 00:17:42.493 "data_size": 65536 00:17:42.493 }, 00:17:42.493 { 00:17:42.493 "name": "BaseBdev3", 00:17:42.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.493 "is_configured": false, 00:17:42.493 "data_offset": 0, 00:17:42.493 "data_size": 0 00:17:42.493 }, 00:17:42.493 { 00:17:42.493 "name": "BaseBdev4", 00:17:42.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.493 "is_configured": false, 00:17:42.493 
"data_offset": 0, 00:17:42.493 "data_size": 0 00:17:42.493 } 00:17:42.493 ] 00:17:42.493 }' 00:17:42.493 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.493 09:21:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:43.060 09:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:43.317 [2024-07-15 09:21:52.108139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:43.317 BaseBdev3 00:17:43.317 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:43.317 09:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:43.317 09:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:43.317 09:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:43.317 09:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:43.317 09:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:43.317 09:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:43.575 09:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:43.833 [ 00:17:43.833 { 00:17:43.833 "name": "BaseBdev3", 00:17:43.833 "aliases": [ 00:17:43.833 "e118a769-415e-414a-9b4c-42222e15b1bf" 00:17:43.833 ], 00:17:43.833 "product_name": "Malloc disk", 00:17:43.833 "block_size": 512, 00:17:43.833 "num_blocks": 65536, 00:17:43.833 "uuid": "e118a769-415e-414a-9b4c-42222e15b1bf", 00:17:43.833 "assigned_rate_limits": { 00:17:43.833 "rw_ios_per_sec": 0, 00:17:43.833 "rw_mbytes_per_sec": 0, 00:17:43.833 "r_mbytes_per_sec": 0, 00:17:43.833 "w_mbytes_per_sec": 0 00:17:43.833 }, 00:17:43.833 "claimed": true, 00:17:43.833 "claim_type": "exclusive_write", 00:17:43.833 "zoned": false, 00:17:43.833 "supported_io_types": { 00:17:43.833 "read": true, 00:17:43.833 "write": true, 00:17:43.833 "unmap": true, 00:17:43.833 "flush": true, 00:17:43.833 "reset": true, 00:17:43.833 "nvme_admin": false, 00:17:43.833 "nvme_io": false, 00:17:43.833 "nvme_io_md": false, 00:17:43.833 "write_zeroes": true, 00:17:43.833 "zcopy": true, 00:17:43.833 "get_zone_info": false, 00:17:43.833 "zone_management": false, 00:17:43.834 "zone_append": false, 00:17:43.834 "compare": false, 00:17:43.834 "compare_and_write": false, 00:17:43.834 "abort": true, 00:17:43.834 "seek_hole": false, 00:17:43.834 "seek_data": false, 00:17:43.834 "copy": true, 00:17:43.834 "nvme_iov_md": false 00:17:43.834 }, 00:17:43.834 "memory_domains": [ 00:17:43.834 { 00:17:43.834 "dma_device_id": "system", 00:17:43.834 "dma_device_type": 1 00:17:43.834 }, 00:17:43.834 { 00:17:43.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.834 "dma_device_type": 2 00:17:43.834 } 00:17:43.834 ], 00:17:43.834 "driver_specific": {} 00:17:43.834 } 00:17:43.834 ] 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:43.834 09:21:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.834 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.091 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.091 "name": "Existed_Raid", 00:17:44.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.091 "strip_size_kb": 64, 00:17:44.091 "state": "configuring", 00:17:44.091 "raid_level": "raid0", 00:17:44.091 "superblock": false, 00:17:44.091 "num_base_bdevs": 4, 00:17:44.091 "num_base_bdevs_discovered": 3, 00:17:44.091 "num_base_bdevs_operational": 4, 00:17:44.091 "base_bdevs_list": [ 00:17:44.091 { 00:17:44.091 "name": "BaseBdev1", 00:17:44.091 "uuid": "a1c15a68-b66c-479f-972b-0bd6366f16ec", 00:17:44.091 "is_configured": true, 00:17:44.091 "data_offset": 0, 00:17:44.091 "data_size": 65536 00:17:44.091 }, 00:17:44.091 { 00:17:44.091 "name": "BaseBdev2", 00:17:44.091 "uuid": "3aca3d18-69a4-4154-aa52-6b1f1fc0107d", 00:17:44.091 "is_configured": true, 00:17:44.091 "data_offset": 0, 00:17:44.091 "data_size": 65536 00:17:44.091 }, 00:17:44.091 { 00:17:44.091 "name": "BaseBdev3", 00:17:44.091 "uuid": "e118a769-415e-414a-9b4c-42222e15b1bf", 00:17:44.091 "is_configured": true, 00:17:44.091 "data_offset": 0, 00:17:44.091 "data_size": 65536 00:17:44.091 }, 00:17:44.091 { 00:17:44.091 "name": "BaseBdev4", 00:17:44.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.091 "is_configured": false, 00:17:44.091 "data_offset": 0, 00:17:44.091 "data_size": 0 00:17:44.091 } 00:17:44.091 ] 00:17:44.091 }' 00:17:44.091 09:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.091 09:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.657 09:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:44.916 [2024-07-15 
09:21:53.687721] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:44.916 [2024-07-15 09:21:53.687758] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e19350 00:17:44.916 [2024-07-15 09:21:53.687767] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:44.916 [2024-07-15 09:21:53.688022] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e19020 00:17:44.916 [2024-07-15 09:21:53.688144] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e19350 00:17:44.916 [2024-07-15 09:21:53.688154] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e19350 00:17:44.916 [2024-07-15 09:21:53.688313] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:44.916 BaseBdev4 00:17:44.916 09:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:44.916 09:21:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:44.916 09:21:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:44.916 09:21:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:44.916 09:21:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:44.916 09:21:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:44.916 09:21:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:45.174 09:21:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:45.433 [ 00:17:45.433 { 00:17:45.433 "name": "BaseBdev4", 00:17:45.433 "aliases": [ 00:17:45.433 "75c17d46-60d2-4edd-adc3-eee26b4e0fb3" 00:17:45.433 ], 00:17:45.433 "product_name": "Malloc disk", 00:17:45.433 "block_size": 512, 00:17:45.433 "num_blocks": 65536, 00:17:45.433 "uuid": "75c17d46-60d2-4edd-adc3-eee26b4e0fb3", 00:17:45.433 "assigned_rate_limits": { 00:17:45.433 "rw_ios_per_sec": 0, 00:17:45.433 "rw_mbytes_per_sec": 0, 00:17:45.433 "r_mbytes_per_sec": 0, 00:17:45.433 "w_mbytes_per_sec": 0 00:17:45.433 }, 00:17:45.433 "claimed": true, 00:17:45.433 "claim_type": "exclusive_write", 00:17:45.433 "zoned": false, 00:17:45.433 "supported_io_types": { 00:17:45.433 "read": true, 00:17:45.433 "write": true, 00:17:45.433 "unmap": true, 00:17:45.433 "flush": true, 00:17:45.433 "reset": true, 00:17:45.433 "nvme_admin": false, 00:17:45.433 "nvme_io": false, 00:17:45.433 "nvme_io_md": false, 00:17:45.433 "write_zeroes": true, 00:17:45.433 "zcopy": true, 00:17:45.433 "get_zone_info": false, 00:17:45.433 "zone_management": false, 00:17:45.433 "zone_append": false, 00:17:45.433 "compare": false, 00:17:45.433 "compare_and_write": false, 00:17:45.433 "abort": true, 00:17:45.433 "seek_hole": false, 00:17:45.433 "seek_data": false, 00:17:45.433 "copy": true, 00:17:45.433 "nvme_iov_md": false 00:17:45.433 }, 00:17:45.433 "memory_domains": [ 00:17:45.433 { 00:17:45.433 "dma_device_id": "system", 00:17:45.433 "dma_device_type": 1 00:17:45.433 }, 00:17:45.433 { 00:17:45.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.433 "dma_device_type": 2 
00:17:45.433 } 00:17:45.433 ], 00:17:45.433 "driver_specific": {} 00:17:45.433 } 00:17:45.433 ] 00:17:45.433 09:21:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:45.433 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:45.433 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:45.433 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:45.433 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:45.433 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:45.433 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:45.434 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:45.434 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:45.434 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.434 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.434 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.434 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.434 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.434 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:45.692 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.692 "name": "Existed_Raid", 00:17:45.692 "uuid": "1792291c-0740-4db8-8ada-0da6c5380489", 00:17:45.692 "strip_size_kb": 64, 00:17:45.692 "state": "online", 00:17:45.692 "raid_level": "raid0", 00:17:45.692 "superblock": false, 00:17:45.692 "num_base_bdevs": 4, 00:17:45.692 "num_base_bdevs_discovered": 4, 00:17:45.692 "num_base_bdevs_operational": 4, 00:17:45.692 "base_bdevs_list": [ 00:17:45.692 { 00:17:45.692 "name": "BaseBdev1", 00:17:45.692 "uuid": "a1c15a68-b66c-479f-972b-0bd6366f16ec", 00:17:45.692 "is_configured": true, 00:17:45.692 "data_offset": 0, 00:17:45.692 "data_size": 65536 00:17:45.692 }, 00:17:45.692 { 00:17:45.692 "name": "BaseBdev2", 00:17:45.692 "uuid": "3aca3d18-69a4-4154-aa52-6b1f1fc0107d", 00:17:45.692 "is_configured": true, 00:17:45.692 "data_offset": 0, 00:17:45.692 "data_size": 65536 00:17:45.692 }, 00:17:45.692 { 00:17:45.692 "name": "BaseBdev3", 00:17:45.692 "uuid": "e118a769-415e-414a-9b4c-42222e15b1bf", 00:17:45.692 "is_configured": true, 00:17:45.692 "data_offset": 0, 00:17:45.692 "data_size": 65536 00:17:45.692 }, 00:17:45.692 { 00:17:45.692 "name": "BaseBdev4", 00:17:45.692 "uuid": "75c17d46-60d2-4edd-adc3-eee26b4e0fb3", 00:17:45.692 "is_configured": true, 00:17:45.692 "data_offset": 0, 00:17:45.692 "data_size": 65536 00:17:45.692 } 00:17:45.692 ] 00:17:45.692 }' 00:17:45.692 09:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.692 09:21:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.259 09:21:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:46.259 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:46.259 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:46.259 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:46.259 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:46.259 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:46.259 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:46.259 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:46.531 [2024-07-15 09:21:55.248164] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:46.531 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:46.531 "name": "Existed_Raid", 00:17:46.531 "aliases": [ 00:17:46.531 "1792291c-0740-4db8-8ada-0da6c5380489" 00:17:46.531 ], 00:17:46.531 "product_name": "Raid Volume", 00:17:46.531 "block_size": 512, 00:17:46.531 "num_blocks": 262144, 00:17:46.531 "uuid": "1792291c-0740-4db8-8ada-0da6c5380489", 00:17:46.531 "assigned_rate_limits": { 00:17:46.531 "rw_ios_per_sec": 0, 00:17:46.531 "rw_mbytes_per_sec": 0, 00:17:46.531 "r_mbytes_per_sec": 0, 00:17:46.531 "w_mbytes_per_sec": 0 00:17:46.531 }, 00:17:46.531 "claimed": false, 00:17:46.531 "zoned": false, 00:17:46.531 "supported_io_types": { 00:17:46.531 "read": true, 00:17:46.531 "write": true, 00:17:46.531 "unmap": true, 00:17:46.531 "flush": true, 00:17:46.531 "reset": true, 00:17:46.531 "nvme_admin": false, 00:17:46.531 "nvme_io": false, 00:17:46.531 "nvme_io_md": false, 00:17:46.531 "write_zeroes": true, 00:17:46.531 "zcopy": false, 00:17:46.531 "get_zone_info": false, 00:17:46.531 "zone_management": false, 00:17:46.531 "zone_append": false, 00:17:46.531 "compare": false, 00:17:46.531 "compare_and_write": false, 00:17:46.531 "abort": false, 00:17:46.531 "seek_hole": false, 00:17:46.531 "seek_data": false, 00:17:46.531 "copy": false, 00:17:46.531 "nvme_iov_md": false 00:17:46.531 }, 00:17:46.531 "memory_domains": [ 00:17:46.531 { 00:17:46.531 "dma_device_id": "system", 00:17:46.531 "dma_device_type": 1 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.531 "dma_device_type": 2 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "dma_device_id": "system", 00:17:46.531 "dma_device_type": 1 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.531 "dma_device_type": 2 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "dma_device_id": "system", 00:17:46.531 "dma_device_type": 1 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.531 "dma_device_type": 2 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "dma_device_id": "system", 00:17:46.531 "dma_device_type": 1 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.531 "dma_device_type": 2 00:17:46.531 } 00:17:46.531 ], 00:17:46.531 "driver_specific": { 00:17:46.531 "raid": { 00:17:46.531 "uuid": "1792291c-0740-4db8-8ada-0da6c5380489", 00:17:46.531 "strip_size_kb": 64, 00:17:46.531 
"state": "online", 00:17:46.531 "raid_level": "raid0", 00:17:46.531 "superblock": false, 00:17:46.531 "num_base_bdevs": 4, 00:17:46.531 "num_base_bdevs_discovered": 4, 00:17:46.531 "num_base_bdevs_operational": 4, 00:17:46.531 "base_bdevs_list": [ 00:17:46.531 { 00:17:46.531 "name": "BaseBdev1", 00:17:46.531 "uuid": "a1c15a68-b66c-479f-972b-0bd6366f16ec", 00:17:46.531 "is_configured": true, 00:17:46.531 "data_offset": 0, 00:17:46.531 "data_size": 65536 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "name": "BaseBdev2", 00:17:46.531 "uuid": "3aca3d18-69a4-4154-aa52-6b1f1fc0107d", 00:17:46.531 "is_configured": true, 00:17:46.531 "data_offset": 0, 00:17:46.531 "data_size": 65536 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "name": "BaseBdev3", 00:17:46.531 "uuid": "e118a769-415e-414a-9b4c-42222e15b1bf", 00:17:46.531 "is_configured": true, 00:17:46.531 "data_offset": 0, 00:17:46.531 "data_size": 65536 00:17:46.531 }, 00:17:46.531 { 00:17:46.531 "name": "BaseBdev4", 00:17:46.531 "uuid": "75c17d46-60d2-4edd-adc3-eee26b4e0fb3", 00:17:46.531 "is_configured": true, 00:17:46.531 "data_offset": 0, 00:17:46.531 "data_size": 65536 00:17:46.531 } 00:17:46.531 ] 00:17:46.531 } 00:17:46.531 } 00:17:46.531 }' 00:17:46.531 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:46.531 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:46.531 BaseBdev2 00:17:46.531 BaseBdev3 00:17:46.531 BaseBdev4' 00:17:46.531 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:46.531 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:46.531 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:46.795 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:46.795 "name": "BaseBdev1", 00:17:46.795 "aliases": [ 00:17:46.795 "a1c15a68-b66c-479f-972b-0bd6366f16ec" 00:17:46.795 ], 00:17:46.795 "product_name": "Malloc disk", 00:17:46.795 "block_size": 512, 00:17:46.795 "num_blocks": 65536, 00:17:46.795 "uuid": "a1c15a68-b66c-479f-972b-0bd6366f16ec", 00:17:46.795 "assigned_rate_limits": { 00:17:46.795 "rw_ios_per_sec": 0, 00:17:46.795 "rw_mbytes_per_sec": 0, 00:17:46.795 "r_mbytes_per_sec": 0, 00:17:46.795 "w_mbytes_per_sec": 0 00:17:46.795 }, 00:17:46.795 "claimed": true, 00:17:46.795 "claim_type": "exclusive_write", 00:17:46.795 "zoned": false, 00:17:46.795 "supported_io_types": { 00:17:46.795 "read": true, 00:17:46.795 "write": true, 00:17:46.795 "unmap": true, 00:17:46.795 "flush": true, 00:17:46.795 "reset": true, 00:17:46.795 "nvme_admin": false, 00:17:46.795 "nvme_io": false, 00:17:46.795 "nvme_io_md": false, 00:17:46.795 "write_zeroes": true, 00:17:46.795 "zcopy": true, 00:17:46.795 "get_zone_info": false, 00:17:46.795 "zone_management": false, 00:17:46.795 "zone_append": false, 00:17:46.795 "compare": false, 00:17:46.795 "compare_and_write": false, 00:17:46.795 "abort": true, 00:17:46.795 "seek_hole": false, 00:17:46.795 "seek_data": false, 00:17:46.795 "copy": true, 00:17:46.795 "nvme_iov_md": false 00:17:46.795 }, 00:17:46.795 "memory_domains": [ 00:17:46.795 { 00:17:46.795 "dma_device_id": "system", 00:17:46.795 "dma_device_type": 1 00:17:46.795 }, 00:17:46.795 { 00:17:46.795 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.795 "dma_device_type": 2 00:17:46.795 } 00:17:46.795 ], 00:17:46.795 "driver_specific": {} 00:17:46.795 }' 00:17:46.795 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.795 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.795 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:46.795 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.795 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:47.054 09:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.312 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.312 "name": "BaseBdev2", 00:17:47.312 "aliases": [ 00:17:47.312 "3aca3d18-69a4-4154-aa52-6b1f1fc0107d" 00:17:47.312 ], 00:17:47.312 "product_name": "Malloc disk", 00:17:47.312 "block_size": 512, 00:17:47.312 "num_blocks": 65536, 00:17:47.312 "uuid": "3aca3d18-69a4-4154-aa52-6b1f1fc0107d", 00:17:47.312 "assigned_rate_limits": { 00:17:47.312 "rw_ios_per_sec": 0, 00:17:47.312 "rw_mbytes_per_sec": 0, 00:17:47.312 "r_mbytes_per_sec": 0, 00:17:47.312 "w_mbytes_per_sec": 0 00:17:47.312 }, 00:17:47.312 "claimed": true, 00:17:47.312 "claim_type": "exclusive_write", 00:17:47.312 "zoned": false, 00:17:47.312 "supported_io_types": { 00:17:47.312 "read": true, 00:17:47.312 "write": true, 00:17:47.312 "unmap": true, 00:17:47.312 "flush": true, 00:17:47.312 "reset": true, 00:17:47.312 "nvme_admin": false, 00:17:47.312 "nvme_io": false, 00:17:47.312 "nvme_io_md": false, 00:17:47.312 "write_zeroes": true, 00:17:47.312 "zcopy": true, 00:17:47.312 "get_zone_info": false, 00:17:47.312 "zone_management": false, 00:17:47.312 "zone_append": false, 00:17:47.313 "compare": false, 00:17:47.313 "compare_and_write": false, 00:17:47.313 "abort": true, 00:17:47.313 "seek_hole": false, 00:17:47.313 "seek_data": false, 00:17:47.313 "copy": true, 00:17:47.313 "nvme_iov_md": false 00:17:47.313 }, 00:17:47.313 "memory_domains": [ 00:17:47.313 { 00:17:47.313 "dma_device_id": "system", 00:17:47.313 "dma_device_type": 1 00:17:47.313 }, 00:17:47.313 { 00:17:47.313 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.313 "dma_device_type": 2 00:17:47.313 } 00:17:47.313 ], 00:17:47.313 "driver_specific": {} 00:17:47.313 }' 00:17:47.313 09:21:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.313 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.571 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.571 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.571 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.571 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.571 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.571 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.571 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.571 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.571 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.834 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.834 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.834 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:47.834 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.834 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.834 "name": "BaseBdev3", 00:17:47.834 "aliases": [ 00:17:47.834 "e118a769-415e-414a-9b4c-42222e15b1bf" 00:17:47.834 ], 00:17:47.834 "product_name": "Malloc disk", 00:17:47.834 "block_size": 512, 00:17:47.834 "num_blocks": 65536, 00:17:47.834 "uuid": "e118a769-415e-414a-9b4c-42222e15b1bf", 00:17:47.834 "assigned_rate_limits": { 00:17:47.834 "rw_ios_per_sec": 0, 00:17:47.834 "rw_mbytes_per_sec": 0, 00:17:47.834 "r_mbytes_per_sec": 0, 00:17:47.834 "w_mbytes_per_sec": 0 00:17:47.834 }, 00:17:47.834 "claimed": true, 00:17:47.834 "claim_type": "exclusive_write", 00:17:47.834 "zoned": false, 00:17:47.834 "supported_io_types": { 00:17:47.834 "read": true, 00:17:47.834 "write": true, 00:17:47.834 "unmap": true, 00:17:47.834 "flush": true, 00:17:47.835 "reset": true, 00:17:47.835 "nvme_admin": false, 00:17:47.835 "nvme_io": false, 00:17:47.835 "nvme_io_md": false, 00:17:47.835 "write_zeroes": true, 00:17:47.835 "zcopy": true, 00:17:47.835 "get_zone_info": false, 00:17:47.835 "zone_management": false, 00:17:47.835 "zone_append": false, 00:17:47.835 "compare": false, 00:17:47.835 "compare_and_write": false, 00:17:47.835 "abort": true, 00:17:47.835 "seek_hole": false, 00:17:47.835 "seek_data": false, 00:17:47.835 "copy": true, 00:17:47.835 "nvme_iov_md": false 00:17:47.835 }, 00:17:47.835 "memory_domains": [ 00:17:47.835 { 00:17:47.835 "dma_device_id": "system", 00:17:47.835 "dma_device_type": 1 00:17:47.835 }, 00:17:47.835 { 00:17:47.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.835 "dma_device_type": 2 00:17:47.835 } 00:17:47.835 ], 00:17:47.835 "driver_specific": {} 00:17:47.835 }' 00:17:47.835 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.835 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:17:47.835 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.835 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.097 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.097 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.097 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.097 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.097 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.097 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.097 09:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.097 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.097 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:48.097 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:48.097 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.356 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.356 "name": "BaseBdev4", 00:17:48.356 "aliases": [ 00:17:48.356 "75c17d46-60d2-4edd-adc3-eee26b4e0fb3" 00:17:48.356 ], 00:17:48.356 "product_name": "Malloc disk", 00:17:48.356 "block_size": 512, 00:17:48.356 "num_blocks": 65536, 00:17:48.356 "uuid": "75c17d46-60d2-4edd-adc3-eee26b4e0fb3", 00:17:48.356 "assigned_rate_limits": { 00:17:48.356 "rw_ios_per_sec": 0, 00:17:48.356 "rw_mbytes_per_sec": 0, 00:17:48.356 "r_mbytes_per_sec": 0, 00:17:48.356 "w_mbytes_per_sec": 0 00:17:48.356 }, 00:17:48.356 "claimed": true, 00:17:48.356 "claim_type": "exclusive_write", 00:17:48.356 "zoned": false, 00:17:48.356 "supported_io_types": { 00:17:48.356 "read": true, 00:17:48.356 "write": true, 00:17:48.356 "unmap": true, 00:17:48.356 "flush": true, 00:17:48.356 "reset": true, 00:17:48.356 "nvme_admin": false, 00:17:48.356 "nvme_io": false, 00:17:48.356 "nvme_io_md": false, 00:17:48.356 "write_zeroes": true, 00:17:48.356 "zcopy": true, 00:17:48.356 "get_zone_info": false, 00:17:48.356 "zone_management": false, 00:17:48.357 "zone_append": false, 00:17:48.357 "compare": false, 00:17:48.357 "compare_and_write": false, 00:17:48.357 "abort": true, 00:17:48.357 "seek_hole": false, 00:17:48.357 "seek_data": false, 00:17:48.357 "copy": true, 00:17:48.357 "nvme_iov_md": false 00:17:48.357 }, 00:17:48.357 "memory_domains": [ 00:17:48.357 { 00:17:48.357 "dma_device_id": "system", 00:17:48.357 "dma_device_type": 1 00:17:48.357 }, 00:17:48.357 { 00:17:48.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.357 "dma_device_type": 2 00:17:48.357 } 00:17:48.357 ], 00:17:48.357 "driver_specific": {} 00:17:48.357 }' 00:17:48.357 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.357 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.357 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.357 09:21:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.357 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.616 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.616 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.616 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.616 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.616 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.616 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.616 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.616 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:48.876 [2024-07-15 09:21:57.730476] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:48.876 [2024-07-15 09:21:57.730503] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:48.876 [2024-07-15 09:21:57.730549] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.876 09:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:49.135 09:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.135 
"name": "Existed_Raid", 00:17:49.135 "uuid": "1792291c-0740-4db8-8ada-0da6c5380489", 00:17:49.135 "strip_size_kb": 64, 00:17:49.135 "state": "offline", 00:17:49.135 "raid_level": "raid0", 00:17:49.135 "superblock": false, 00:17:49.135 "num_base_bdevs": 4, 00:17:49.135 "num_base_bdevs_discovered": 3, 00:17:49.135 "num_base_bdevs_operational": 3, 00:17:49.135 "base_bdevs_list": [ 00:17:49.135 { 00:17:49.135 "name": null, 00:17:49.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.135 "is_configured": false, 00:17:49.135 "data_offset": 0, 00:17:49.135 "data_size": 65536 00:17:49.135 }, 00:17:49.135 { 00:17:49.135 "name": "BaseBdev2", 00:17:49.135 "uuid": "3aca3d18-69a4-4154-aa52-6b1f1fc0107d", 00:17:49.135 "is_configured": true, 00:17:49.135 "data_offset": 0, 00:17:49.135 "data_size": 65536 00:17:49.135 }, 00:17:49.135 { 00:17:49.135 "name": "BaseBdev3", 00:17:49.135 "uuid": "e118a769-415e-414a-9b4c-42222e15b1bf", 00:17:49.135 "is_configured": true, 00:17:49.135 "data_offset": 0, 00:17:49.135 "data_size": 65536 00:17:49.135 }, 00:17:49.135 { 00:17:49.135 "name": "BaseBdev4", 00:17:49.135 "uuid": "75c17d46-60d2-4edd-adc3-eee26b4e0fb3", 00:17:49.135 "is_configured": true, 00:17:49.135 "data_offset": 0, 00:17:49.135 "data_size": 65536 00:17:49.135 } 00:17:49.135 ] 00:17:49.135 }' 00:17:49.135 09:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.135 09:21:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.704 09:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:49.704 09:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:49.704 09:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.704 09:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:49.962 09:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:49.962 09:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:49.962 09:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:50.220 [2024-07-15 09:21:59.067034] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:50.220 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:50.220 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:50.221 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.221 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:50.478 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:50.478 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:50.478 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:50.736 
[2024-07-15 09:21:59.566961] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:50.736 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:50.736 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:50.736 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.736 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:50.993 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:50.993 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:50.993 09:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:51.251 [2024-07-15 09:22:00.071836] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:51.251 [2024-07-15 09:22:00.071889] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e19350 name Existed_Raid, state offline 00:17:51.251 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:51.251 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:51.251 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.251 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:51.509 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:51.509 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:51.509 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:51.509 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:51.509 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:51.509 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:51.766 BaseBdev2 00:17:51.766 09:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:51.766 09:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:51.766 09:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:51.766 09:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:51.766 09:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:51.766 09:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:51.766 09:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.024 09:22:00 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:52.282 [ 00:17:52.282 { 00:17:52.282 "name": "BaseBdev2", 00:17:52.282 "aliases": [ 00:17:52.282 "081319a8-85bc-438c-8cee-0604ca43e8aa" 00:17:52.282 ], 00:17:52.282 "product_name": "Malloc disk", 00:17:52.282 "block_size": 512, 00:17:52.282 "num_blocks": 65536, 00:17:52.282 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:17:52.282 "assigned_rate_limits": { 00:17:52.282 "rw_ios_per_sec": 0, 00:17:52.282 "rw_mbytes_per_sec": 0, 00:17:52.282 "r_mbytes_per_sec": 0, 00:17:52.282 "w_mbytes_per_sec": 0 00:17:52.282 }, 00:17:52.282 "claimed": false, 00:17:52.282 "zoned": false, 00:17:52.282 "supported_io_types": { 00:17:52.282 "read": true, 00:17:52.282 "write": true, 00:17:52.282 "unmap": true, 00:17:52.282 "flush": true, 00:17:52.282 "reset": true, 00:17:52.282 "nvme_admin": false, 00:17:52.282 "nvme_io": false, 00:17:52.282 "nvme_io_md": false, 00:17:52.282 "write_zeroes": true, 00:17:52.282 "zcopy": true, 00:17:52.282 "get_zone_info": false, 00:17:52.282 "zone_management": false, 00:17:52.282 "zone_append": false, 00:17:52.282 "compare": false, 00:17:52.282 "compare_and_write": false, 00:17:52.282 "abort": true, 00:17:52.282 "seek_hole": false, 00:17:52.282 "seek_data": false, 00:17:52.282 "copy": true, 00:17:52.282 "nvme_iov_md": false 00:17:52.282 }, 00:17:52.282 "memory_domains": [ 00:17:52.282 { 00:17:52.282 "dma_device_id": "system", 00:17:52.282 "dma_device_type": 1 00:17:52.282 }, 00:17:52.282 { 00:17:52.282 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.282 "dma_device_type": 2 00:17:52.282 } 00:17:52.282 ], 00:17:52.282 "driver_specific": {} 00:17:52.282 } 00:17:52.282 ] 00:17:52.282 09:22:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:52.282 09:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:52.282 09:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:52.282 09:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:52.541 BaseBdev3 00:17:52.541 09:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:52.541 09:22:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:52.541 09:22:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:52.541 09:22:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:52.541 09:22:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:52.541 09:22:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:52.541 09:22:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.799 09:22:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:53.057 [ 00:17:53.057 { 00:17:53.057 "name": "BaseBdev3", 00:17:53.057 "aliases": [ 00:17:53.057 
"51776947-eed3-4ac7-909a-7d2c71fda526" 00:17:53.057 ], 00:17:53.057 "product_name": "Malloc disk", 00:17:53.057 "block_size": 512, 00:17:53.057 "num_blocks": 65536, 00:17:53.057 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:17:53.057 "assigned_rate_limits": { 00:17:53.057 "rw_ios_per_sec": 0, 00:17:53.057 "rw_mbytes_per_sec": 0, 00:17:53.057 "r_mbytes_per_sec": 0, 00:17:53.057 "w_mbytes_per_sec": 0 00:17:53.057 }, 00:17:53.057 "claimed": false, 00:17:53.057 "zoned": false, 00:17:53.057 "supported_io_types": { 00:17:53.057 "read": true, 00:17:53.057 "write": true, 00:17:53.057 "unmap": true, 00:17:53.057 "flush": true, 00:17:53.058 "reset": true, 00:17:53.058 "nvme_admin": false, 00:17:53.058 "nvme_io": false, 00:17:53.058 "nvme_io_md": false, 00:17:53.058 "write_zeroes": true, 00:17:53.058 "zcopy": true, 00:17:53.058 "get_zone_info": false, 00:17:53.058 "zone_management": false, 00:17:53.058 "zone_append": false, 00:17:53.058 "compare": false, 00:17:53.058 "compare_and_write": false, 00:17:53.058 "abort": true, 00:17:53.058 "seek_hole": false, 00:17:53.058 "seek_data": false, 00:17:53.058 "copy": true, 00:17:53.058 "nvme_iov_md": false 00:17:53.058 }, 00:17:53.058 "memory_domains": [ 00:17:53.058 { 00:17:53.058 "dma_device_id": "system", 00:17:53.058 "dma_device_type": 1 00:17:53.058 }, 00:17:53.058 { 00:17:53.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.058 "dma_device_type": 2 00:17:53.058 } 00:17:53.058 ], 00:17:53.058 "driver_specific": {} 00:17:53.058 } 00:17:53.058 ] 00:17:53.058 09:22:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:53.058 09:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:53.058 09:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:53.058 09:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:53.317 BaseBdev4 00:17:53.317 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:53.317 09:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:53.317 09:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:53.317 09:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:53.317 09:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:53.317 09:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:53.317 09:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:53.576 09:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:53.576 [ 00:17:53.576 { 00:17:53.576 "name": "BaseBdev4", 00:17:53.576 "aliases": [ 00:17:53.576 "4e720bac-d5c5-4f92-9ae1-22b63081daf7" 00:17:53.576 ], 00:17:53.576 "product_name": "Malloc disk", 00:17:53.576 "block_size": 512, 00:17:53.576 "num_blocks": 65536, 00:17:53.576 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:17:53.576 "assigned_rate_limits": { 00:17:53.576 
"rw_ios_per_sec": 0, 00:17:53.576 "rw_mbytes_per_sec": 0, 00:17:53.576 "r_mbytes_per_sec": 0, 00:17:53.576 "w_mbytes_per_sec": 0 00:17:53.576 }, 00:17:53.576 "claimed": false, 00:17:53.576 "zoned": false, 00:17:53.576 "supported_io_types": { 00:17:53.576 "read": true, 00:17:53.576 "write": true, 00:17:53.576 "unmap": true, 00:17:53.576 "flush": true, 00:17:53.576 "reset": true, 00:17:53.576 "nvme_admin": false, 00:17:53.576 "nvme_io": false, 00:17:53.576 "nvme_io_md": false, 00:17:53.576 "write_zeroes": true, 00:17:53.576 "zcopy": true, 00:17:53.576 "get_zone_info": false, 00:17:53.576 "zone_management": false, 00:17:53.576 "zone_append": false, 00:17:53.576 "compare": false, 00:17:53.576 "compare_and_write": false, 00:17:53.576 "abort": true, 00:17:53.576 "seek_hole": false, 00:17:53.576 "seek_data": false, 00:17:53.576 "copy": true, 00:17:53.576 "nvme_iov_md": false 00:17:53.576 }, 00:17:53.576 "memory_domains": [ 00:17:53.576 { 00:17:53.576 "dma_device_id": "system", 00:17:53.576 "dma_device_type": 1 00:17:53.576 }, 00:17:53.576 { 00:17:53.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.576 "dma_device_type": 2 00:17:53.576 } 00:17:53.576 ], 00:17:53.576 "driver_specific": {} 00:17:53.576 } 00:17:53.576 ] 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:53.835 [2024-07-15 09:22:02.699010] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:53.835 [2024-07-15 09:22:02.699052] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:53.835 [2024-07-15 09:22:02.699070] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:53.835 [2024-07-15 09:22:02.700424] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:53.835 [2024-07-15 09:22:02.700464] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.835 09:22:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.835 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.095 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.095 "name": "Existed_Raid", 00:17:54.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.095 "strip_size_kb": 64, 00:17:54.095 "state": "configuring", 00:17:54.095 "raid_level": "raid0", 00:17:54.095 "superblock": false, 00:17:54.095 "num_base_bdevs": 4, 00:17:54.095 "num_base_bdevs_discovered": 3, 00:17:54.095 "num_base_bdevs_operational": 4, 00:17:54.095 "base_bdevs_list": [ 00:17:54.095 { 00:17:54.095 "name": "BaseBdev1", 00:17:54.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.095 "is_configured": false, 00:17:54.095 "data_offset": 0, 00:17:54.095 "data_size": 0 00:17:54.095 }, 00:17:54.095 { 00:17:54.095 "name": "BaseBdev2", 00:17:54.095 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:17:54.095 "is_configured": true, 00:17:54.095 "data_offset": 0, 00:17:54.095 "data_size": 65536 00:17:54.095 }, 00:17:54.095 { 00:17:54.095 "name": "BaseBdev3", 00:17:54.095 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:17:54.095 "is_configured": true, 00:17:54.095 "data_offset": 0, 00:17:54.095 "data_size": 65536 00:17:54.095 }, 00:17:54.095 { 00:17:54.095 "name": "BaseBdev4", 00:17:54.095 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:17:54.095 "is_configured": true, 00:17:54.095 "data_offset": 0, 00:17:54.095 "data_size": 65536 00:17:54.095 } 00:17:54.095 ] 00:17:54.095 }' 00:17:54.095 09:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.095 09:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.662 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:54.920 [2024-07-15 09:22:03.673553] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:54.920 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:54.920 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.920 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.920 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:54.920 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.920 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.920 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.920 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.921 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.921 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.921 09:22:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.921 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.180 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.180 "name": "Existed_Raid", 00:17:55.180 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.180 "strip_size_kb": 64, 00:17:55.180 "state": "configuring", 00:17:55.180 "raid_level": "raid0", 00:17:55.181 "superblock": false, 00:17:55.181 "num_base_bdevs": 4, 00:17:55.181 "num_base_bdevs_discovered": 2, 00:17:55.181 "num_base_bdevs_operational": 4, 00:17:55.181 "base_bdevs_list": [ 00:17:55.181 { 00:17:55.181 "name": "BaseBdev1", 00:17:55.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.181 "is_configured": false, 00:17:55.181 "data_offset": 0, 00:17:55.181 "data_size": 0 00:17:55.181 }, 00:17:55.181 { 00:17:55.181 "name": null, 00:17:55.181 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:17:55.181 "is_configured": false, 00:17:55.181 "data_offset": 0, 00:17:55.181 "data_size": 65536 00:17:55.181 }, 00:17:55.181 { 00:17:55.181 "name": "BaseBdev3", 00:17:55.181 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:17:55.181 "is_configured": true, 00:17:55.181 "data_offset": 0, 00:17:55.181 "data_size": 65536 00:17:55.181 }, 00:17:55.181 { 00:17:55.181 "name": "BaseBdev4", 00:17:55.181 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:17:55.181 "is_configured": true, 00:17:55.181 "data_offset": 0, 00:17:55.181 "data_size": 65536 00:17:55.181 } 00:17:55.181 ] 00:17:55.181 }' 00:17:55.181 09:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.181 09:22:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.749 09:22:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.749 09:22:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:56.008 09:22:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:56.008 09:22:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:56.267 [2024-07-15 09:22:04.997633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:56.267 BaseBdev1 00:17:56.267 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:56.267 09:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:56.267 09:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:56.267 09:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:56.267 09:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:56.267 09:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:56.267 09:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:56.536 09:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:56.795 [ 00:17:56.795 { 00:17:56.795 "name": "BaseBdev1", 00:17:56.795 "aliases": [ 00:17:56.795 "b8d3c8cc-5f42-41e1-98fd-95609e8afd23" 00:17:56.795 ], 00:17:56.795 "product_name": "Malloc disk", 00:17:56.795 "block_size": 512, 00:17:56.795 "num_blocks": 65536, 00:17:56.795 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:17:56.795 "assigned_rate_limits": { 00:17:56.795 "rw_ios_per_sec": 0, 00:17:56.795 "rw_mbytes_per_sec": 0, 00:17:56.795 "r_mbytes_per_sec": 0, 00:17:56.795 "w_mbytes_per_sec": 0 00:17:56.795 }, 00:17:56.795 "claimed": true, 00:17:56.795 "claim_type": "exclusive_write", 00:17:56.795 "zoned": false, 00:17:56.795 "supported_io_types": { 00:17:56.795 "read": true, 00:17:56.795 "write": true, 00:17:56.795 "unmap": true, 00:17:56.795 "flush": true, 00:17:56.795 "reset": true, 00:17:56.795 "nvme_admin": false, 00:17:56.795 "nvme_io": false, 00:17:56.795 "nvme_io_md": false, 00:17:56.795 "write_zeroes": true, 00:17:56.795 "zcopy": true, 00:17:56.795 "get_zone_info": false, 00:17:56.795 "zone_management": false, 00:17:56.795 "zone_append": false, 00:17:56.795 "compare": false, 00:17:56.795 "compare_and_write": false, 00:17:56.795 "abort": true, 00:17:56.795 "seek_hole": false, 00:17:56.795 "seek_data": false, 00:17:56.795 "copy": true, 00:17:56.795 "nvme_iov_md": false 00:17:56.795 }, 00:17:56.795 "memory_domains": [ 00:17:56.795 { 00:17:56.795 "dma_device_id": "system", 00:17:56.795 "dma_device_type": 1 00:17:56.795 }, 00:17:56.795 { 00:17:56.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.795 "dma_device_type": 2 00:17:56.795 } 00:17:56.795 ], 00:17:56.795 "driver_specific": {} 00:17:56.795 } 00:17:56.795 ] 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.795 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:17:57.054 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.054 "name": "Existed_Raid", 00:17:57.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.054 "strip_size_kb": 64, 00:17:57.054 "state": "configuring", 00:17:57.054 "raid_level": "raid0", 00:17:57.054 "superblock": false, 00:17:57.054 "num_base_bdevs": 4, 00:17:57.054 "num_base_bdevs_discovered": 3, 00:17:57.054 "num_base_bdevs_operational": 4, 00:17:57.054 "base_bdevs_list": [ 00:17:57.054 { 00:17:57.054 "name": "BaseBdev1", 00:17:57.054 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:17:57.054 "is_configured": true, 00:17:57.054 "data_offset": 0, 00:17:57.054 "data_size": 65536 00:17:57.054 }, 00:17:57.054 { 00:17:57.054 "name": null, 00:17:57.054 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:17:57.054 "is_configured": false, 00:17:57.054 "data_offset": 0, 00:17:57.054 "data_size": 65536 00:17:57.054 }, 00:17:57.054 { 00:17:57.054 "name": "BaseBdev3", 00:17:57.054 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:17:57.054 "is_configured": true, 00:17:57.054 "data_offset": 0, 00:17:57.054 "data_size": 65536 00:17:57.054 }, 00:17:57.054 { 00:17:57.054 "name": "BaseBdev4", 00:17:57.054 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:17:57.054 "is_configured": true, 00:17:57.054 "data_offset": 0, 00:17:57.054 "data_size": 65536 00:17:57.054 } 00:17:57.054 ] 00:17:57.054 }' 00:17:57.054 09:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.054 09:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.623 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.623 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:57.623 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:57.623 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:57.883 [2024-07-15 09:22:06.694345] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.883 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.143 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.143 "name": "Existed_Raid", 00:17:58.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.143 "strip_size_kb": 64, 00:17:58.143 "state": "configuring", 00:17:58.143 "raid_level": "raid0", 00:17:58.143 "superblock": false, 00:17:58.143 "num_base_bdevs": 4, 00:17:58.143 "num_base_bdevs_discovered": 2, 00:17:58.143 "num_base_bdevs_operational": 4, 00:17:58.143 "base_bdevs_list": [ 00:17:58.143 { 00:17:58.143 "name": "BaseBdev1", 00:17:58.143 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:17:58.143 "is_configured": true, 00:17:58.143 "data_offset": 0, 00:17:58.143 "data_size": 65536 00:17:58.143 }, 00:17:58.143 { 00:17:58.143 "name": null, 00:17:58.143 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:17:58.143 "is_configured": false, 00:17:58.143 "data_offset": 0, 00:17:58.143 "data_size": 65536 00:17:58.143 }, 00:17:58.143 { 00:17:58.143 "name": null, 00:17:58.143 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:17:58.143 "is_configured": false, 00:17:58.143 "data_offset": 0, 00:17:58.143 "data_size": 65536 00:17:58.143 }, 00:17:58.143 { 00:17:58.143 "name": "BaseBdev4", 00:17:58.143 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:17:58.143 "is_configured": true, 00:17:58.143 "data_offset": 0, 00:17:58.143 "data_size": 65536 00:17:58.143 } 00:17:58.143 ] 00:17:58.143 }' 00:17:58.143 09:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.143 09:22:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.712 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.712 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:58.971 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:58.972 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:59.231 [2024-07-15 09:22:07.949682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:59.231 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:59.231 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.231 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:59.231 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:59.231 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:59.231 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:59.231 09:22:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.231 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.231 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.232 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.232 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.232 09:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.491 09:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.491 "name": "Existed_Raid", 00:17:59.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.491 "strip_size_kb": 64, 00:17:59.491 "state": "configuring", 00:17:59.491 "raid_level": "raid0", 00:17:59.491 "superblock": false, 00:17:59.491 "num_base_bdevs": 4, 00:17:59.491 "num_base_bdevs_discovered": 3, 00:17:59.491 "num_base_bdevs_operational": 4, 00:17:59.491 "base_bdevs_list": [ 00:17:59.491 { 00:17:59.491 "name": "BaseBdev1", 00:17:59.491 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:17:59.491 "is_configured": true, 00:17:59.491 "data_offset": 0, 00:17:59.491 "data_size": 65536 00:17:59.491 }, 00:17:59.491 { 00:17:59.491 "name": null, 00:17:59.491 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:17:59.491 "is_configured": false, 00:17:59.491 "data_offset": 0, 00:17:59.491 "data_size": 65536 00:17:59.491 }, 00:17:59.491 { 00:17:59.491 "name": "BaseBdev3", 00:17:59.491 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:17:59.491 "is_configured": true, 00:17:59.491 "data_offset": 0, 00:17:59.491 "data_size": 65536 00:17:59.491 }, 00:17:59.491 { 00:17:59.491 "name": "BaseBdev4", 00:17:59.491 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:17:59.491 "is_configured": true, 00:17:59.491 "data_offset": 0, 00:17:59.491 "data_size": 65536 00:17:59.491 } 00:17:59.491 ] 00:17:59.491 }' 00:17:59.491 09:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.491 09:22:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.059 09:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.059 09:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:00.059 09:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:00.059 09:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:00.321 [2024-07-15 09:22:09.176944] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:00.321 09:22:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.321 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.627 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.627 "name": "Existed_Raid", 00:18:00.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.627 "strip_size_kb": 64, 00:18:00.627 "state": "configuring", 00:18:00.627 "raid_level": "raid0", 00:18:00.627 "superblock": false, 00:18:00.627 "num_base_bdevs": 4, 00:18:00.627 "num_base_bdevs_discovered": 2, 00:18:00.627 "num_base_bdevs_operational": 4, 00:18:00.627 "base_bdevs_list": [ 00:18:00.627 { 00:18:00.627 "name": null, 00:18:00.627 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:18:00.627 "is_configured": false, 00:18:00.627 "data_offset": 0, 00:18:00.627 "data_size": 65536 00:18:00.627 }, 00:18:00.627 { 00:18:00.627 "name": null, 00:18:00.627 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:18:00.627 "is_configured": false, 00:18:00.627 "data_offset": 0, 00:18:00.627 "data_size": 65536 00:18:00.627 }, 00:18:00.627 { 00:18:00.627 "name": "BaseBdev3", 00:18:00.627 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:18:00.627 "is_configured": true, 00:18:00.627 "data_offset": 0, 00:18:00.627 "data_size": 65536 00:18:00.627 }, 00:18:00.627 { 00:18:00.627 "name": "BaseBdev4", 00:18:00.627 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:18:00.627 "is_configured": true, 00:18:00.627 "data_offset": 0, 00:18:00.627 "data_size": 65536 00:18:00.627 } 00:18:00.627 ] 00:18:00.627 }' 00:18:00.627 09:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.627 09:22:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:01.209 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.209 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:01.468 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:01.468 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:01.727 [2024-07-15 09:22:10.581075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:01.727 09:22:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.727 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:01.987 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.987 "name": "Existed_Raid", 00:18:01.987 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:01.987 "strip_size_kb": 64, 00:18:01.987 "state": "configuring", 00:18:01.987 "raid_level": "raid0", 00:18:01.987 "superblock": false, 00:18:01.987 "num_base_bdevs": 4, 00:18:01.987 "num_base_bdevs_discovered": 3, 00:18:01.987 "num_base_bdevs_operational": 4, 00:18:01.987 "base_bdevs_list": [ 00:18:01.987 { 00:18:01.987 "name": null, 00:18:01.987 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:18:01.987 "is_configured": false, 00:18:01.987 "data_offset": 0, 00:18:01.987 "data_size": 65536 00:18:01.987 }, 00:18:01.987 { 00:18:01.987 "name": "BaseBdev2", 00:18:01.987 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:18:01.987 "is_configured": true, 00:18:01.987 "data_offset": 0, 00:18:01.987 "data_size": 65536 00:18:01.987 }, 00:18:01.987 { 00:18:01.987 "name": "BaseBdev3", 00:18:01.987 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:18:01.987 "is_configured": true, 00:18:01.987 "data_offset": 0, 00:18:01.987 "data_size": 65536 00:18:01.987 }, 00:18:01.987 { 00:18:01.987 "name": "BaseBdev4", 00:18:01.987 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:18:01.987 "is_configured": true, 00:18:01.987 "data_offset": 0, 00:18:01.987 "data_size": 65536 00:18:01.987 } 00:18:01.987 ] 00:18:01.987 }' 00:18:01.987 09:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.987 09:22:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.554 09:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.554 09:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:02.815 09:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:02.815 
09:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.815 09:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:03.074 09:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b8d3c8cc-5f42-41e1-98fd-95609e8afd23 00:18:03.074 [2024-07-15 09:22:12.005351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:03.074 [2024-07-15 09:22:12.005389] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e1d040 00:18:03.074 [2024-07-15 09:22:12.005397] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:03.074 [2024-07-15 09:22:12.005589] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e18a70 00:18:03.074 [2024-07-15 09:22:12.005707] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e1d040 00:18:03.074 [2024-07-15 09:22:12.005717] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e1d040 00:18:03.074 [2024-07-15 09:22:12.005875] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:03.074 NewBaseBdev 00:18:03.074 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:03.074 09:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:03.074 09:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:03.074 09:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:03.074 09:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:03.074 09:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:03.074 09:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:03.332 09:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:03.590 [ 00:18:03.590 { 00:18:03.590 "name": "NewBaseBdev", 00:18:03.590 "aliases": [ 00:18:03.590 "b8d3c8cc-5f42-41e1-98fd-95609e8afd23" 00:18:03.590 ], 00:18:03.590 "product_name": "Malloc disk", 00:18:03.590 "block_size": 512, 00:18:03.590 "num_blocks": 65536, 00:18:03.590 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:18:03.590 "assigned_rate_limits": { 00:18:03.590 "rw_ios_per_sec": 0, 00:18:03.590 "rw_mbytes_per_sec": 0, 00:18:03.590 "r_mbytes_per_sec": 0, 00:18:03.590 "w_mbytes_per_sec": 0 00:18:03.590 }, 00:18:03.590 "claimed": true, 00:18:03.590 "claim_type": "exclusive_write", 00:18:03.590 "zoned": false, 00:18:03.590 "supported_io_types": { 00:18:03.590 "read": true, 00:18:03.590 "write": true, 00:18:03.590 "unmap": true, 00:18:03.590 "flush": true, 00:18:03.590 "reset": true, 00:18:03.590 "nvme_admin": false, 00:18:03.590 "nvme_io": false, 00:18:03.590 "nvme_io_md": false, 00:18:03.590 "write_zeroes": true, 00:18:03.590 "zcopy": true, 
00:18:03.590 "get_zone_info": false, 00:18:03.590 "zone_management": false, 00:18:03.590 "zone_append": false, 00:18:03.590 "compare": false, 00:18:03.590 "compare_and_write": false, 00:18:03.590 "abort": true, 00:18:03.590 "seek_hole": false, 00:18:03.590 "seek_data": false, 00:18:03.590 "copy": true, 00:18:03.590 "nvme_iov_md": false 00:18:03.590 }, 00:18:03.590 "memory_domains": [ 00:18:03.590 { 00:18:03.590 "dma_device_id": "system", 00:18:03.590 "dma_device_type": 1 00:18:03.590 }, 00:18:03.590 { 00:18:03.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.590 "dma_device_type": 2 00:18:03.590 } 00:18:03.590 ], 00:18:03.590 "driver_specific": {} 00:18:03.590 } 00:18:03.590 ] 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.590 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.849 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.849 "name": "Existed_Raid", 00:18:03.849 "uuid": "6057f0b8-b35f-4d49-a908-44d5b1d977a6", 00:18:03.849 "strip_size_kb": 64, 00:18:03.849 "state": "online", 00:18:03.849 "raid_level": "raid0", 00:18:03.849 "superblock": false, 00:18:03.849 "num_base_bdevs": 4, 00:18:03.849 "num_base_bdevs_discovered": 4, 00:18:03.849 "num_base_bdevs_operational": 4, 00:18:03.849 "base_bdevs_list": [ 00:18:03.849 { 00:18:03.849 "name": "NewBaseBdev", 00:18:03.849 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:18:03.849 "is_configured": true, 00:18:03.849 "data_offset": 0, 00:18:03.849 "data_size": 65536 00:18:03.849 }, 00:18:03.849 { 00:18:03.849 "name": "BaseBdev2", 00:18:03.849 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:18:03.849 "is_configured": true, 00:18:03.849 "data_offset": 0, 00:18:03.849 "data_size": 65536 00:18:03.849 }, 00:18:03.849 { 00:18:03.849 "name": "BaseBdev3", 00:18:03.849 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:18:03.849 "is_configured": true, 00:18:03.849 "data_offset": 0, 00:18:03.849 "data_size": 65536 00:18:03.849 }, 00:18:03.849 { 00:18:03.849 "name": "BaseBdev4", 00:18:03.849 "uuid": 
"4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:18:03.849 "is_configured": true, 00:18:03.849 "data_offset": 0, 00:18:03.849 "data_size": 65536 00:18:03.849 } 00:18:03.849 ] 00:18:03.849 }' 00:18:03.849 09:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.849 09:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.784 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:04.784 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:04.784 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:04.784 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:04.784 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:04.784 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:04.784 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:04.784 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:05.042 [2024-07-15 09:22:13.898693] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:05.042 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:05.042 "name": "Existed_Raid", 00:18:05.042 "aliases": [ 00:18:05.042 "6057f0b8-b35f-4d49-a908-44d5b1d977a6" 00:18:05.042 ], 00:18:05.042 "product_name": "Raid Volume", 00:18:05.042 "block_size": 512, 00:18:05.042 "num_blocks": 262144, 00:18:05.042 "uuid": "6057f0b8-b35f-4d49-a908-44d5b1d977a6", 00:18:05.042 "assigned_rate_limits": { 00:18:05.042 "rw_ios_per_sec": 0, 00:18:05.042 "rw_mbytes_per_sec": 0, 00:18:05.042 "r_mbytes_per_sec": 0, 00:18:05.042 "w_mbytes_per_sec": 0 00:18:05.042 }, 00:18:05.042 "claimed": false, 00:18:05.042 "zoned": false, 00:18:05.042 "supported_io_types": { 00:18:05.042 "read": true, 00:18:05.042 "write": true, 00:18:05.042 "unmap": true, 00:18:05.042 "flush": true, 00:18:05.042 "reset": true, 00:18:05.042 "nvme_admin": false, 00:18:05.042 "nvme_io": false, 00:18:05.042 "nvme_io_md": false, 00:18:05.042 "write_zeroes": true, 00:18:05.042 "zcopy": false, 00:18:05.042 "get_zone_info": false, 00:18:05.042 "zone_management": false, 00:18:05.042 "zone_append": false, 00:18:05.042 "compare": false, 00:18:05.042 "compare_and_write": false, 00:18:05.042 "abort": false, 00:18:05.042 "seek_hole": false, 00:18:05.042 "seek_data": false, 00:18:05.042 "copy": false, 00:18:05.042 "nvme_iov_md": false 00:18:05.042 }, 00:18:05.042 "memory_domains": [ 00:18:05.042 { 00:18:05.042 "dma_device_id": "system", 00:18:05.042 "dma_device_type": 1 00:18:05.042 }, 00:18:05.042 { 00:18:05.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.042 "dma_device_type": 2 00:18:05.042 }, 00:18:05.042 { 00:18:05.042 "dma_device_id": "system", 00:18:05.042 "dma_device_type": 1 00:18:05.042 }, 00:18:05.042 { 00:18:05.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.042 "dma_device_type": 2 00:18:05.042 }, 00:18:05.042 { 00:18:05.042 "dma_device_id": "system", 00:18:05.042 "dma_device_type": 1 00:18:05.042 }, 00:18:05.042 { 00:18:05.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.042 "dma_device_type": 2 00:18:05.042 }, 
00:18:05.042 { 00:18:05.042 "dma_device_id": "system", 00:18:05.042 "dma_device_type": 1 00:18:05.042 }, 00:18:05.042 { 00:18:05.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.042 "dma_device_type": 2 00:18:05.042 } 00:18:05.042 ], 00:18:05.042 "driver_specific": { 00:18:05.042 "raid": { 00:18:05.042 "uuid": "6057f0b8-b35f-4d49-a908-44d5b1d977a6", 00:18:05.042 "strip_size_kb": 64, 00:18:05.042 "state": "online", 00:18:05.042 "raid_level": "raid0", 00:18:05.042 "superblock": false, 00:18:05.042 "num_base_bdevs": 4, 00:18:05.042 "num_base_bdevs_discovered": 4, 00:18:05.042 "num_base_bdevs_operational": 4, 00:18:05.042 "base_bdevs_list": [ 00:18:05.042 { 00:18:05.042 "name": "NewBaseBdev", 00:18:05.042 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:18:05.042 "is_configured": true, 00:18:05.042 "data_offset": 0, 00:18:05.042 "data_size": 65536 00:18:05.042 }, 00:18:05.042 { 00:18:05.042 "name": "BaseBdev2", 00:18:05.042 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:18:05.042 "is_configured": true, 00:18:05.042 "data_offset": 0, 00:18:05.042 "data_size": 65536 00:18:05.042 }, 00:18:05.042 { 00:18:05.042 "name": "BaseBdev3", 00:18:05.042 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:18:05.042 "is_configured": true, 00:18:05.042 "data_offset": 0, 00:18:05.042 "data_size": 65536 00:18:05.042 }, 00:18:05.042 { 00:18:05.042 "name": "BaseBdev4", 00:18:05.042 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:18:05.042 "is_configured": true, 00:18:05.042 "data_offset": 0, 00:18:05.042 "data_size": 65536 00:18:05.042 } 00:18:05.042 ] 00:18:05.042 } 00:18:05.042 } 00:18:05.042 }' 00:18:05.042 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:05.042 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:05.042 BaseBdev2 00:18:05.042 BaseBdev3 00:18:05.042 BaseBdev4' 00:18:05.042 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:05.042 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:05.042 09:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:05.301 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:05.301 "name": "NewBaseBdev", 00:18:05.301 "aliases": [ 00:18:05.301 "b8d3c8cc-5f42-41e1-98fd-95609e8afd23" 00:18:05.301 ], 00:18:05.301 "product_name": "Malloc disk", 00:18:05.301 "block_size": 512, 00:18:05.301 "num_blocks": 65536, 00:18:05.301 "uuid": "b8d3c8cc-5f42-41e1-98fd-95609e8afd23", 00:18:05.301 "assigned_rate_limits": { 00:18:05.301 "rw_ios_per_sec": 0, 00:18:05.301 "rw_mbytes_per_sec": 0, 00:18:05.301 "r_mbytes_per_sec": 0, 00:18:05.301 "w_mbytes_per_sec": 0 00:18:05.301 }, 00:18:05.301 "claimed": true, 00:18:05.301 "claim_type": "exclusive_write", 00:18:05.301 "zoned": false, 00:18:05.301 "supported_io_types": { 00:18:05.301 "read": true, 00:18:05.301 "write": true, 00:18:05.301 "unmap": true, 00:18:05.301 "flush": true, 00:18:05.301 "reset": true, 00:18:05.301 "nvme_admin": false, 00:18:05.301 "nvme_io": false, 00:18:05.301 "nvme_io_md": false, 00:18:05.301 "write_zeroes": true, 00:18:05.301 "zcopy": true, 00:18:05.301 "get_zone_info": false, 00:18:05.301 "zone_management": false, 00:18:05.301 "zone_append": false, 
00:18:05.301 "compare": false, 00:18:05.301 "compare_and_write": false, 00:18:05.301 "abort": true, 00:18:05.301 "seek_hole": false, 00:18:05.301 "seek_data": false, 00:18:05.301 "copy": true, 00:18:05.301 "nvme_iov_md": false 00:18:05.301 }, 00:18:05.301 "memory_domains": [ 00:18:05.301 { 00:18:05.301 "dma_device_id": "system", 00:18:05.301 "dma_device_type": 1 00:18:05.301 }, 00:18:05.301 { 00:18:05.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.301 "dma_device_type": 2 00:18:05.301 } 00:18:05.301 ], 00:18:05.301 "driver_specific": {} 00:18:05.301 }' 00:18:05.301 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.301 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.560 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:05.560 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.560 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.560 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:05.560 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.560 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.560 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:05.560 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.560 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.819 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:05.819 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:05.819 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:05.819 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:06.079 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:06.079 "name": "BaseBdev2", 00:18:06.079 "aliases": [ 00:18:06.079 "081319a8-85bc-438c-8cee-0604ca43e8aa" 00:18:06.079 ], 00:18:06.079 "product_name": "Malloc disk", 00:18:06.079 "block_size": 512, 00:18:06.079 "num_blocks": 65536, 00:18:06.079 "uuid": "081319a8-85bc-438c-8cee-0604ca43e8aa", 00:18:06.079 "assigned_rate_limits": { 00:18:06.079 "rw_ios_per_sec": 0, 00:18:06.079 "rw_mbytes_per_sec": 0, 00:18:06.079 "r_mbytes_per_sec": 0, 00:18:06.079 "w_mbytes_per_sec": 0 00:18:06.079 }, 00:18:06.079 "claimed": true, 00:18:06.079 "claim_type": "exclusive_write", 00:18:06.079 "zoned": false, 00:18:06.079 "supported_io_types": { 00:18:06.079 "read": true, 00:18:06.079 "write": true, 00:18:06.079 "unmap": true, 00:18:06.079 "flush": true, 00:18:06.079 "reset": true, 00:18:06.079 "nvme_admin": false, 00:18:06.079 "nvme_io": false, 00:18:06.079 "nvme_io_md": false, 00:18:06.079 "write_zeroes": true, 00:18:06.079 "zcopy": true, 00:18:06.079 "get_zone_info": false, 00:18:06.079 "zone_management": false, 00:18:06.079 "zone_append": false, 00:18:06.079 "compare": false, 00:18:06.079 "compare_and_write": false, 00:18:06.079 "abort": true, 00:18:06.079 "seek_hole": false, 00:18:06.079 "seek_data": false, 00:18:06.079 
"copy": true, 00:18:06.079 "nvme_iov_md": false 00:18:06.079 }, 00:18:06.079 "memory_domains": [ 00:18:06.079 { 00:18:06.079 "dma_device_id": "system", 00:18:06.079 "dma_device_type": 1 00:18:06.079 }, 00:18:06.079 { 00:18:06.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.079 "dma_device_type": 2 00:18:06.079 } 00:18:06.079 ], 00:18:06.079 "driver_specific": {} 00:18:06.079 }' 00:18:06.079 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.079 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.079 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:06.079 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.079 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.079 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:06.079 09:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.339 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.339 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:06.339 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.339 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.339 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:06.339 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:06.339 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:06.339 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:06.599 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:06.599 "name": "BaseBdev3", 00:18:06.599 "aliases": [ 00:18:06.599 "51776947-eed3-4ac7-909a-7d2c71fda526" 00:18:06.599 ], 00:18:06.599 "product_name": "Malloc disk", 00:18:06.599 "block_size": 512, 00:18:06.599 "num_blocks": 65536, 00:18:06.599 "uuid": "51776947-eed3-4ac7-909a-7d2c71fda526", 00:18:06.599 "assigned_rate_limits": { 00:18:06.599 "rw_ios_per_sec": 0, 00:18:06.599 "rw_mbytes_per_sec": 0, 00:18:06.599 "r_mbytes_per_sec": 0, 00:18:06.599 "w_mbytes_per_sec": 0 00:18:06.599 }, 00:18:06.599 "claimed": true, 00:18:06.599 "claim_type": "exclusive_write", 00:18:06.599 "zoned": false, 00:18:06.599 "supported_io_types": { 00:18:06.599 "read": true, 00:18:06.599 "write": true, 00:18:06.599 "unmap": true, 00:18:06.599 "flush": true, 00:18:06.599 "reset": true, 00:18:06.599 "nvme_admin": false, 00:18:06.599 "nvme_io": false, 00:18:06.599 "nvme_io_md": false, 00:18:06.599 "write_zeroes": true, 00:18:06.599 "zcopy": true, 00:18:06.599 "get_zone_info": false, 00:18:06.599 "zone_management": false, 00:18:06.599 "zone_append": false, 00:18:06.599 "compare": false, 00:18:06.599 "compare_and_write": false, 00:18:06.599 "abort": true, 00:18:06.599 "seek_hole": false, 00:18:06.599 "seek_data": false, 00:18:06.599 "copy": true, 00:18:06.599 "nvme_iov_md": false 00:18:06.599 }, 00:18:06.599 "memory_domains": [ 00:18:06.599 { 00:18:06.599 "dma_device_id": "system", 00:18:06.599 
"dma_device_type": 1 00:18:06.599 }, 00:18:06.599 { 00:18:06.600 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.600 "dma_device_type": 2 00:18:06.600 } 00:18:06.600 ], 00:18:06.600 "driver_specific": {} 00:18:06.600 }' 00:18:06.600 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.600 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.600 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:06.600 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.600 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.600 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:06.600 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.859 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.859 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:06.859 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.859 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.859 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:06.859 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:06.859 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:06.859 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:07.118 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:07.118 "name": "BaseBdev4", 00:18:07.118 "aliases": [ 00:18:07.118 "4e720bac-d5c5-4f92-9ae1-22b63081daf7" 00:18:07.118 ], 00:18:07.118 "product_name": "Malloc disk", 00:18:07.118 "block_size": 512, 00:18:07.118 "num_blocks": 65536, 00:18:07.118 "uuid": "4e720bac-d5c5-4f92-9ae1-22b63081daf7", 00:18:07.118 "assigned_rate_limits": { 00:18:07.118 "rw_ios_per_sec": 0, 00:18:07.118 "rw_mbytes_per_sec": 0, 00:18:07.118 "r_mbytes_per_sec": 0, 00:18:07.118 "w_mbytes_per_sec": 0 00:18:07.118 }, 00:18:07.118 "claimed": true, 00:18:07.118 "claim_type": "exclusive_write", 00:18:07.118 "zoned": false, 00:18:07.118 "supported_io_types": { 00:18:07.118 "read": true, 00:18:07.118 "write": true, 00:18:07.118 "unmap": true, 00:18:07.118 "flush": true, 00:18:07.118 "reset": true, 00:18:07.118 "nvme_admin": false, 00:18:07.118 "nvme_io": false, 00:18:07.118 "nvme_io_md": false, 00:18:07.118 "write_zeroes": true, 00:18:07.118 "zcopy": true, 00:18:07.118 "get_zone_info": false, 00:18:07.118 "zone_management": false, 00:18:07.118 "zone_append": false, 00:18:07.118 "compare": false, 00:18:07.118 "compare_and_write": false, 00:18:07.118 "abort": true, 00:18:07.118 "seek_hole": false, 00:18:07.118 "seek_data": false, 00:18:07.118 "copy": true, 00:18:07.118 "nvme_iov_md": false 00:18:07.118 }, 00:18:07.118 "memory_domains": [ 00:18:07.118 { 00:18:07.118 "dma_device_id": "system", 00:18:07.118 "dma_device_type": 1 00:18:07.118 }, 00:18:07.118 { 00:18:07.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.118 "dma_device_type": 2 00:18:07.118 } 00:18:07.118 ], 
00:18:07.118 "driver_specific": {} 00:18:07.118 }' 00:18:07.118 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:07.118 09:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:07.118 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:07.118 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:07.118 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:07.385 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:07.385 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:07.385 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:07.385 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:07.385 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:07.385 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:07.385 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:07.385 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:07.646 [2024-07-15 09:22:16.477233] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:07.646 [2024-07-15 09:22:16.477259] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:07.646 [2024-07-15 09:22:16.477311] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:07.646 [2024-07-15 09:22:16.477370] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:07.646 [2024-07-15 09:22:16.477383] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e1d040 name Existed_Raid, state offline 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 145408 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 145408 ']' 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 145408 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 145408 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 145408' 00:18:07.646 killing process with pid 145408 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 145408 00:18:07.646 [2024-07-15 09:22:16.528586] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:07.646 09:22:16 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@972 -- # wait 145408 00:18:07.646 [2024-07-15 09:22:16.570726] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:07.904 09:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:07.904 00:18:07.904 real 0m32.068s 00:18:07.904 user 0m58.893s 00:18:07.904 sys 0m5.686s 00:18:07.904 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:07.904 09:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.904 ************************************ 00:18:07.904 END TEST raid_state_function_test 00:18:07.904 ************************************ 00:18:07.904 09:22:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:07.904 09:22:16 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:07.904 09:22:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:07.904 09:22:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:07.904 09:22:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:08.163 ************************************ 00:18:08.163 START TEST raid_state_function_test_sb 00:18:08.163 ************************************ 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 
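For reference, the only functional difference the _sb variant introduces is visible in the @237/@238 trace just above: with superblock=true the script sets superblock_create_arg=-s, so the raid bdev is created with an on-disk superblock. A minimal sketch of how that create call is assembled, reusing only the rpc.py path, socket, strip size, raid level and base bdev names that appear verbatim later in this trace; it illustrates the pattern and is not the literal bdev_raid.sh code:

    # Arguments as assembled by the traced script: "-z 64" is the strip size in KB,
    # "-s" is only present for the superblock (_sb) variant of the test.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    strip_size_create_arg='-z 64'
    superblock_create_arg='-s'

    # Create a 4-disk raid0 named Existed_Raid on top of base bdevs that may not exist yet;
    # the RPC then reports them as "doesn't exist now", as the DEBUG lines further down show.
    $rpc -s $sock bdev_raid_create $strip_size_create_arg $superblock_create_arg \
        -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid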
00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=150175 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 150175' 00:18:08.163 Process raid pid: 150175 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 150175 /var/tmp/spdk-raid.sock 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 150175 ']' 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:08.163 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:08.163 09:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:08.163 [2024-07-15 09:22:16.944562] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
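The harness startup traced above boils down to two steps: launch the bdev_svc app on a private RPC socket with raid debug logging, then block until that socket answers RPC calls (the script does this with waitforlisten from autotest_common.sh, whose body is not shown in this log). A small sketch of the same sequence, reusing only the binary path, socket and flags from the @243/@246 lines; the polling loop is a stand-in assumption for waitforlisten, not its real implementation:

    app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Start the bdev service app on its own RPC socket with bdev_raid debug logs enabled.
    $app -r $sock -i 0 -L bdev_raid &
    raid_pid=$!

    # Wait until the app is listening: poll it with an RPC this test already uses.
    until $rpc -s $sock bdev_raid_get_bdevs all > /dev/null 2>&1; do
        sleep 1
    done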
00:18:08.163 [2024-07-15 09:22:16.944630] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:08.163 [2024-07-15 09:22:17.075669] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:08.422 [2024-07-15 09:22:17.179402] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:08.422 [2024-07-15 09:22:17.240675] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:08.422 [2024-07-15 09:22:17.240707] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:08.988 09:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:08.988 09:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:08.988 09:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:09.247 [2024-07-15 09:22:18.097219] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:09.247 [2024-07-15 09:22:18.097262] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:09.247 [2024-07-15 09:22:18.097273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:09.247 [2024-07-15 09:22:18.097284] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:09.247 [2024-07-15 09:22:18.097293] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:09.247 [2024-07-15 09:22:18.097305] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:09.247 [2024-07-15 09:22:18.097314] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:09.247 [2024-07-15 09:22:18.097324] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.247 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:09.506 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.506 "name": "Existed_Raid", 00:18:09.506 "uuid": "43108081-c2c3-440f-a5f7-d174e8020a9a", 00:18:09.506 "strip_size_kb": 64, 00:18:09.506 "state": "configuring", 00:18:09.506 "raid_level": "raid0", 00:18:09.506 "superblock": true, 00:18:09.506 "num_base_bdevs": 4, 00:18:09.506 "num_base_bdevs_discovered": 0, 00:18:09.506 "num_base_bdevs_operational": 4, 00:18:09.506 "base_bdevs_list": [ 00:18:09.506 { 00:18:09.506 "name": "BaseBdev1", 00:18:09.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.506 "is_configured": false, 00:18:09.506 "data_offset": 0, 00:18:09.506 "data_size": 0 00:18:09.506 }, 00:18:09.506 { 00:18:09.506 "name": "BaseBdev2", 00:18:09.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.506 "is_configured": false, 00:18:09.506 "data_offset": 0, 00:18:09.506 "data_size": 0 00:18:09.506 }, 00:18:09.506 { 00:18:09.506 "name": "BaseBdev3", 00:18:09.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.506 "is_configured": false, 00:18:09.506 "data_offset": 0, 00:18:09.507 "data_size": 0 00:18:09.507 }, 00:18:09.507 { 00:18:09.507 "name": "BaseBdev4", 00:18:09.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.507 "is_configured": false, 00:18:09.507 "data_offset": 0, 00:18:09.507 "data_size": 0 00:18:09.507 } 00:18:09.507 ] 00:18:09.507 }' 00:18:09.507 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.507 09:22:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:10.073 09:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:10.332 [2024-07-15 09:22:19.175941] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:10.332 [2024-07-15 09:22:19.175973] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc8aa0 name Existed_Raid, state configuring 00:18:10.332 09:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:10.591 [2024-07-15 09:22:19.404579] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:10.591 [2024-07-15 09:22:19.404608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:10.591 [2024-07-15 09:22:19.404618] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:10.591 [2024-07-15 09:22:19.404629] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:10.591 [2024-07-15 09:22:19.404638] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:10.591 [2024-07-15 09:22:19.404649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:10.591 [2024-07-15 09:22:19.404658] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 
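Every verify_raid_bdev_state call in this trace follows the same query pattern: dump all raid bdevs over RPC, pick out Existed_Raid with jq, and compare individual fields of the resulting JSON against the expected state, level, strip size and base bdev counts that were passed in. A condensed sketch of that pattern, built from the rpc.py and jq invocations shown verbatim above; the per-field checks illustrate the idea and are not the literal body of verify_raid_bdev_state:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Fetch the JSON description of the raid bdev under test, as the @126 lines do.
    raid_bdev_info=$($rpc -s $sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid")')

    # Compare the fields against the expected values for this particular check.
    [ "$(echo "$raid_bdev_info" | jq -r '.state')" = configuring ]
    [ "$(echo "$raid_bdev_info" | jq -r '.raid_level')" = raid0 ]
    [ "$(echo "$raid_bdev_info" | jq -r '.strip_size_kb')" -eq 64 ]
    [ "$(echo "$raid_bdev_info" | jq -r '.num_base_bdevs_operational')" -eq 4 ]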
00:18:10.591 [2024-07-15 09:22:19.404669] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:10.591 09:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:10.850 [2024-07-15 09:22:19.656370] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:10.850 BaseBdev1 00:18:10.850 09:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:10.850 09:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:10.850 09:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:10.850 09:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:10.850 09:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:10.850 09:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:10.850 09:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:11.109 09:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:11.369 [ 00:18:11.369 { 00:18:11.369 "name": "BaseBdev1", 00:18:11.369 "aliases": [ 00:18:11.369 "f2490c29-f9c9-4b00-998b-2d21a456e622" 00:18:11.369 ], 00:18:11.369 "product_name": "Malloc disk", 00:18:11.369 "block_size": 512, 00:18:11.369 "num_blocks": 65536, 00:18:11.369 "uuid": "f2490c29-f9c9-4b00-998b-2d21a456e622", 00:18:11.369 "assigned_rate_limits": { 00:18:11.369 "rw_ios_per_sec": 0, 00:18:11.369 "rw_mbytes_per_sec": 0, 00:18:11.369 "r_mbytes_per_sec": 0, 00:18:11.369 "w_mbytes_per_sec": 0 00:18:11.369 }, 00:18:11.369 "claimed": true, 00:18:11.369 "claim_type": "exclusive_write", 00:18:11.369 "zoned": false, 00:18:11.369 "supported_io_types": { 00:18:11.369 "read": true, 00:18:11.369 "write": true, 00:18:11.369 "unmap": true, 00:18:11.369 "flush": true, 00:18:11.369 "reset": true, 00:18:11.369 "nvme_admin": false, 00:18:11.369 "nvme_io": false, 00:18:11.369 "nvme_io_md": false, 00:18:11.369 "write_zeroes": true, 00:18:11.369 "zcopy": true, 00:18:11.369 "get_zone_info": false, 00:18:11.369 "zone_management": false, 00:18:11.369 "zone_append": false, 00:18:11.369 "compare": false, 00:18:11.369 "compare_and_write": false, 00:18:11.369 "abort": true, 00:18:11.369 "seek_hole": false, 00:18:11.369 "seek_data": false, 00:18:11.369 "copy": true, 00:18:11.369 "nvme_iov_md": false 00:18:11.369 }, 00:18:11.369 "memory_domains": [ 00:18:11.369 { 00:18:11.369 "dma_device_id": "system", 00:18:11.369 "dma_device_type": 1 00:18:11.369 }, 00:18:11.369 { 00:18:11.369 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:11.369 "dma_device_type": 2 00:18:11.369 } 00:18:11.369 ], 00:18:11.369 "driver_specific": {} 00:18:11.369 } 00:18:11.369 ] 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:11.369 
09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.369 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.936 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.936 "name": "Existed_Raid", 00:18:11.936 "uuid": "f2c96541-c53e-4cb4-b018-cc52fbc76b4f", 00:18:11.936 "strip_size_kb": 64, 00:18:11.936 "state": "configuring", 00:18:11.936 "raid_level": "raid0", 00:18:11.936 "superblock": true, 00:18:11.936 "num_base_bdevs": 4, 00:18:11.936 "num_base_bdevs_discovered": 1, 00:18:11.936 "num_base_bdevs_operational": 4, 00:18:11.936 "base_bdevs_list": [ 00:18:11.936 { 00:18:11.936 "name": "BaseBdev1", 00:18:11.936 "uuid": "f2490c29-f9c9-4b00-998b-2d21a456e622", 00:18:11.936 "is_configured": true, 00:18:11.936 "data_offset": 2048, 00:18:11.936 "data_size": 63488 00:18:11.936 }, 00:18:11.936 { 00:18:11.936 "name": "BaseBdev2", 00:18:11.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.936 "is_configured": false, 00:18:11.936 "data_offset": 0, 00:18:11.936 "data_size": 0 00:18:11.936 }, 00:18:11.936 { 00:18:11.936 "name": "BaseBdev3", 00:18:11.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.936 "is_configured": false, 00:18:11.936 "data_offset": 0, 00:18:11.936 "data_size": 0 00:18:11.936 }, 00:18:11.936 { 00:18:11.936 "name": "BaseBdev4", 00:18:11.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.936 "is_configured": false, 00:18:11.936 "data_offset": 0, 00:18:11.936 "data_size": 0 00:18:11.936 } 00:18:11.936 ] 00:18:11.936 }' 00:18:11.936 09:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.936 09:22:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:12.502 09:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:12.762 [2024-07-15 09:22:21.481401] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:12.762 [2024-07-15 09:22:21.481449] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc8310 name Existed_Raid, state configuring 00:18:12.762 09:22:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:13.329 [2024-07-15 09:22:21.978763] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:13.329 [2024-07-15 09:22:21.980275] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:13.329 [2024-07-15 09:22:21.980314] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:13.329 [2024-07-15 09:22:21.980325] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:13.329 [2024-07-15 09:22:21.980338] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:13.329 [2024-07-15 09:22:21.980347] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:13.329 [2024-07-15 09:22:21.980358] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.329 "name": "Existed_Raid", 00:18:13.329 "uuid": "4d7e9133-b858-484a-a125-cbb2357397c5", 00:18:13.329 "strip_size_kb": 64, 00:18:13.329 "state": "configuring", 00:18:13.329 "raid_level": "raid0", 00:18:13.329 "superblock": true, 00:18:13.329 "num_base_bdevs": 4, 00:18:13.329 "num_base_bdevs_discovered": 1, 00:18:13.329 "num_base_bdevs_operational": 4, 00:18:13.329 "base_bdevs_list": [ 00:18:13.329 { 00:18:13.329 "name": "BaseBdev1", 00:18:13.329 "uuid": "f2490c29-f9c9-4b00-998b-2d21a456e622", 00:18:13.329 "is_configured": true, 00:18:13.329 "data_offset": 2048, 
00:18:13.329 "data_size": 63488 00:18:13.329 }, 00:18:13.329 { 00:18:13.329 "name": "BaseBdev2", 00:18:13.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.329 "is_configured": false, 00:18:13.329 "data_offset": 0, 00:18:13.329 "data_size": 0 00:18:13.329 }, 00:18:13.329 { 00:18:13.329 "name": "BaseBdev3", 00:18:13.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.329 "is_configured": false, 00:18:13.329 "data_offset": 0, 00:18:13.329 "data_size": 0 00:18:13.329 }, 00:18:13.329 { 00:18:13.329 "name": "BaseBdev4", 00:18:13.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.329 "is_configured": false, 00:18:13.329 "data_offset": 0, 00:18:13.329 "data_size": 0 00:18:13.329 } 00:18:13.329 ] 00:18:13.329 }' 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.329 09:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:13.896 09:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:14.155 [2024-07-15 09:22:23.081044] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:14.155 BaseBdev2 00:18:14.155 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:14.155 09:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:14.155 09:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:14.155 09:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:14.155 09:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:14.155 09:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:14.155 09:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:14.462 09:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:14.721 [ 00:18:14.721 { 00:18:14.721 "name": "BaseBdev2", 00:18:14.721 "aliases": [ 00:18:14.721 "59b81aa8-7f61-44fb-9729-bc08e143c59e" 00:18:14.721 ], 00:18:14.721 "product_name": "Malloc disk", 00:18:14.721 "block_size": 512, 00:18:14.721 "num_blocks": 65536, 00:18:14.721 "uuid": "59b81aa8-7f61-44fb-9729-bc08e143c59e", 00:18:14.721 "assigned_rate_limits": { 00:18:14.721 "rw_ios_per_sec": 0, 00:18:14.721 "rw_mbytes_per_sec": 0, 00:18:14.721 "r_mbytes_per_sec": 0, 00:18:14.721 "w_mbytes_per_sec": 0 00:18:14.721 }, 00:18:14.721 "claimed": true, 00:18:14.721 "claim_type": "exclusive_write", 00:18:14.721 "zoned": false, 00:18:14.721 "supported_io_types": { 00:18:14.721 "read": true, 00:18:14.721 "write": true, 00:18:14.721 "unmap": true, 00:18:14.721 "flush": true, 00:18:14.721 "reset": true, 00:18:14.721 "nvme_admin": false, 00:18:14.721 "nvme_io": false, 00:18:14.721 "nvme_io_md": false, 00:18:14.721 "write_zeroes": true, 00:18:14.721 "zcopy": true, 00:18:14.721 "get_zone_info": false, 00:18:14.721 "zone_management": false, 00:18:14.721 "zone_append": false, 00:18:14.721 "compare": false, 
00:18:14.721 "compare_and_write": false, 00:18:14.721 "abort": true, 00:18:14.721 "seek_hole": false, 00:18:14.721 "seek_data": false, 00:18:14.721 "copy": true, 00:18:14.721 "nvme_iov_md": false 00:18:14.721 }, 00:18:14.721 "memory_domains": [ 00:18:14.721 { 00:18:14.721 "dma_device_id": "system", 00:18:14.721 "dma_device_type": 1 00:18:14.721 }, 00:18:14.721 { 00:18:14.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.721 "dma_device_type": 2 00:18:14.721 } 00:18:14.721 ], 00:18:14.721 "driver_specific": {} 00:18:14.721 } 00:18:14.721 ] 00:18:14.721 09:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.722 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:14.981 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:14.981 "name": "Existed_Raid", 00:18:14.981 "uuid": "4d7e9133-b858-484a-a125-cbb2357397c5", 00:18:14.981 "strip_size_kb": 64, 00:18:14.981 "state": "configuring", 00:18:14.981 "raid_level": "raid0", 00:18:14.981 "superblock": true, 00:18:14.981 "num_base_bdevs": 4, 00:18:14.981 "num_base_bdevs_discovered": 2, 00:18:14.981 "num_base_bdevs_operational": 4, 00:18:14.981 "base_bdevs_list": [ 00:18:14.981 { 00:18:14.981 "name": "BaseBdev1", 00:18:14.981 "uuid": "f2490c29-f9c9-4b00-998b-2d21a456e622", 00:18:14.981 "is_configured": true, 00:18:14.981 "data_offset": 2048, 00:18:14.981 "data_size": 63488 00:18:14.981 }, 00:18:14.981 { 00:18:14.981 "name": "BaseBdev2", 00:18:14.981 "uuid": "59b81aa8-7f61-44fb-9729-bc08e143c59e", 00:18:14.981 "is_configured": true, 00:18:14.981 "data_offset": 2048, 00:18:14.981 "data_size": 63488 00:18:14.981 }, 00:18:14.981 { 00:18:14.981 "name": "BaseBdev3", 00:18:14.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:14.981 "is_configured": false, 00:18:14.981 "data_offset": 0, 00:18:14.981 
"data_size": 0 00:18:14.981 }, 00:18:14.981 { 00:18:14.981 "name": "BaseBdev4", 00:18:14.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:14.981 "is_configured": false, 00:18:14.981 "data_offset": 0, 00:18:14.981 "data_size": 0 00:18:14.981 } 00:18:14.981 ] 00:18:14.981 }' 00:18:14.981 09:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:14.981 09:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:15.548 09:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:15.806 [2024-07-15 09:22:24.616567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:15.806 BaseBdev3 00:18:15.806 09:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:15.806 09:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:15.806 09:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:15.806 09:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:15.806 09:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:15.806 09:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:15.806 09:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:16.065 09:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:16.324 [ 00:18:16.324 { 00:18:16.324 "name": "BaseBdev3", 00:18:16.324 "aliases": [ 00:18:16.324 "f2e21c72-c833-473b-8fba-1c3284580b8d" 00:18:16.324 ], 00:18:16.324 "product_name": "Malloc disk", 00:18:16.324 "block_size": 512, 00:18:16.324 "num_blocks": 65536, 00:18:16.324 "uuid": "f2e21c72-c833-473b-8fba-1c3284580b8d", 00:18:16.324 "assigned_rate_limits": { 00:18:16.324 "rw_ios_per_sec": 0, 00:18:16.324 "rw_mbytes_per_sec": 0, 00:18:16.324 "r_mbytes_per_sec": 0, 00:18:16.324 "w_mbytes_per_sec": 0 00:18:16.324 }, 00:18:16.324 "claimed": true, 00:18:16.324 "claim_type": "exclusive_write", 00:18:16.324 "zoned": false, 00:18:16.324 "supported_io_types": { 00:18:16.324 "read": true, 00:18:16.324 "write": true, 00:18:16.324 "unmap": true, 00:18:16.324 "flush": true, 00:18:16.324 "reset": true, 00:18:16.324 "nvme_admin": false, 00:18:16.324 "nvme_io": false, 00:18:16.324 "nvme_io_md": false, 00:18:16.324 "write_zeroes": true, 00:18:16.324 "zcopy": true, 00:18:16.324 "get_zone_info": false, 00:18:16.324 "zone_management": false, 00:18:16.324 "zone_append": false, 00:18:16.324 "compare": false, 00:18:16.324 "compare_and_write": false, 00:18:16.324 "abort": true, 00:18:16.324 "seek_hole": false, 00:18:16.324 "seek_data": false, 00:18:16.324 "copy": true, 00:18:16.324 "nvme_iov_md": false 00:18:16.324 }, 00:18:16.324 "memory_domains": [ 00:18:16.324 { 00:18:16.324 "dma_device_id": "system", 00:18:16.324 "dma_device_type": 1 00:18:16.324 }, 00:18:16.324 { 00:18:16.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.324 "dma_device_type": 2 
00:18:16.324 } 00:18:16.324 ], 00:18:16.324 "driver_specific": {} 00:18:16.324 } 00:18:16.324 ] 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.324 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:16.583 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:16.583 "name": "Existed_Raid", 00:18:16.583 "uuid": "4d7e9133-b858-484a-a125-cbb2357397c5", 00:18:16.583 "strip_size_kb": 64, 00:18:16.583 "state": "configuring", 00:18:16.583 "raid_level": "raid0", 00:18:16.583 "superblock": true, 00:18:16.583 "num_base_bdevs": 4, 00:18:16.583 "num_base_bdevs_discovered": 3, 00:18:16.583 "num_base_bdevs_operational": 4, 00:18:16.583 "base_bdevs_list": [ 00:18:16.583 { 00:18:16.583 "name": "BaseBdev1", 00:18:16.583 "uuid": "f2490c29-f9c9-4b00-998b-2d21a456e622", 00:18:16.583 "is_configured": true, 00:18:16.583 "data_offset": 2048, 00:18:16.583 "data_size": 63488 00:18:16.583 }, 00:18:16.583 { 00:18:16.583 "name": "BaseBdev2", 00:18:16.583 "uuid": "59b81aa8-7f61-44fb-9729-bc08e143c59e", 00:18:16.583 "is_configured": true, 00:18:16.583 "data_offset": 2048, 00:18:16.583 "data_size": 63488 00:18:16.583 }, 00:18:16.583 { 00:18:16.583 "name": "BaseBdev3", 00:18:16.583 "uuid": "f2e21c72-c833-473b-8fba-1c3284580b8d", 00:18:16.583 "is_configured": true, 00:18:16.583 "data_offset": 2048, 00:18:16.583 "data_size": 63488 00:18:16.583 }, 00:18:16.583 { 00:18:16.583 "name": "BaseBdev4", 00:18:16.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:16.583 "is_configured": false, 00:18:16.583 "data_offset": 0, 00:18:16.583 "data_size": 0 00:18:16.583 } 00:18:16.583 ] 00:18:16.583 }' 00:18:16.583 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:16.583 09:22:25 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:17.150 09:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:17.409 [2024-07-15 09:22:26.212201] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:17.409 [2024-07-15 09:22:26.212376] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc9350 00:18:17.409 [2024-07-15 09:22:26.212390] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:17.409 [2024-07-15 09:22:26.212567] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc9020 00:18:17.409 [2024-07-15 09:22:26.212686] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc9350 00:18:17.409 [2024-07-15 09:22:26.212696] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfc9350 00:18:17.409 [2024-07-15 09:22:26.212785] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:17.409 BaseBdev4 00:18:17.409 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:17.409 09:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:17.409 09:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:17.409 09:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:17.409 09:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:17.409 09:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:17.409 09:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:17.668 09:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:17.928 [ 00:18:17.928 { 00:18:17.928 "name": "BaseBdev4", 00:18:17.928 "aliases": [ 00:18:17.928 "fff44a0e-230a-44eb-a77c-acb43de83e08" 00:18:17.928 ], 00:18:17.928 "product_name": "Malloc disk", 00:18:17.928 "block_size": 512, 00:18:17.928 "num_blocks": 65536, 00:18:17.928 "uuid": "fff44a0e-230a-44eb-a77c-acb43de83e08", 00:18:17.928 "assigned_rate_limits": { 00:18:17.928 "rw_ios_per_sec": 0, 00:18:17.928 "rw_mbytes_per_sec": 0, 00:18:17.928 "r_mbytes_per_sec": 0, 00:18:17.928 "w_mbytes_per_sec": 0 00:18:17.928 }, 00:18:17.928 "claimed": true, 00:18:17.928 "claim_type": "exclusive_write", 00:18:17.928 "zoned": false, 00:18:17.928 "supported_io_types": { 00:18:17.928 "read": true, 00:18:17.928 "write": true, 00:18:17.928 "unmap": true, 00:18:17.928 "flush": true, 00:18:17.928 "reset": true, 00:18:17.928 "nvme_admin": false, 00:18:17.928 "nvme_io": false, 00:18:17.928 "nvme_io_md": false, 00:18:17.928 "write_zeroes": true, 00:18:17.928 "zcopy": true, 00:18:17.928 "get_zone_info": false, 00:18:17.928 "zone_management": false, 00:18:17.928 "zone_append": false, 00:18:17.928 "compare": false, 00:18:17.928 "compare_and_write": false, 00:18:17.928 "abort": true, 00:18:17.928 "seek_hole": false, 00:18:17.928 "seek_data": false, 00:18:17.928 "copy": 
true, 00:18:17.928 "nvme_iov_md": false 00:18:17.928 }, 00:18:17.928 "memory_domains": [ 00:18:17.928 { 00:18:17.928 "dma_device_id": "system", 00:18:17.928 "dma_device_type": 1 00:18:17.928 }, 00:18:17.928 { 00:18:17.928 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.928 "dma_device_type": 2 00:18:17.928 } 00:18:17.928 ], 00:18:17.928 "driver_specific": {} 00:18:17.928 } 00:18:17.928 ] 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.928 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:18.188 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:18.188 "name": "Existed_Raid", 00:18:18.188 "uuid": "4d7e9133-b858-484a-a125-cbb2357397c5", 00:18:18.188 "strip_size_kb": 64, 00:18:18.188 "state": "online", 00:18:18.188 "raid_level": "raid0", 00:18:18.188 "superblock": true, 00:18:18.188 "num_base_bdevs": 4, 00:18:18.188 "num_base_bdevs_discovered": 4, 00:18:18.188 "num_base_bdevs_operational": 4, 00:18:18.188 "base_bdevs_list": [ 00:18:18.188 { 00:18:18.188 "name": "BaseBdev1", 00:18:18.188 "uuid": "f2490c29-f9c9-4b00-998b-2d21a456e622", 00:18:18.188 "is_configured": true, 00:18:18.188 "data_offset": 2048, 00:18:18.188 "data_size": 63488 00:18:18.188 }, 00:18:18.188 { 00:18:18.188 "name": "BaseBdev2", 00:18:18.188 "uuid": "59b81aa8-7f61-44fb-9729-bc08e143c59e", 00:18:18.188 "is_configured": true, 00:18:18.188 "data_offset": 2048, 00:18:18.188 "data_size": 63488 00:18:18.188 }, 00:18:18.188 { 00:18:18.188 "name": "BaseBdev3", 00:18:18.188 "uuid": "f2e21c72-c833-473b-8fba-1c3284580b8d", 00:18:18.188 "is_configured": true, 00:18:18.188 "data_offset": 2048, 00:18:18.188 "data_size": 63488 00:18:18.188 }, 00:18:18.188 { 00:18:18.188 "name": "BaseBdev4", 00:18:18.188 "uuid": "fff44a0e-230a-44eb-a77c-acb43de83e08", 00:18:18.188 
"is_configured": true, 00:18:18.188 "data_offset": 2048, 00:18:18.188 "data_size": 63488 00:18:18.188 } 00:18:18.188 ] 00:18:18.188 }' 00:18:18.188 09:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:18.188 09:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:18.756 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:18.756 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:18.756 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:18.756 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:18.756 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:18.756 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:18.756 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:18.756 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:19.014 [2024-07-15 09:22:27.780703] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:19.014 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:19.014 "name": "Existed_Raid", 00:18:19.014 "aliases": [ 00:18:19.014 "4d7e9133-b858-484a-a125-cbb2357397c5" 00:18:19.014 ], 00:18:19.014 "product_name": "Raid Volume", 00:18:19.014 "block_size": 512, 00:18:19.014 "num_blocks": 253952, 00:18:19.014 "uuid": "4d7e9133-b858-484a-a125-cbb2357397c5", 00:18:19.014 "assigned_rate_limits": { 00:18:19.014 "rw_ios_per_sec": 0, 00:18:19.014 "rw_mbytes_per_sec": 0, 00:18:19.014 "r_mbytes_per_sec": 0, 00:18:19.014 "w_mbytes_per_sec": 0 00:18:19.014 }, 00:18:19.014 "claimed": false, 00:18:19.014 "zoned": false, 00:18:19.014 "supported_io_types": { 00:18:19.014 "read": true, 00:18:19.014 "write": true, 00:18:19.014 "unmap": true, 00:18:19.014 "flush": true, 00:18:19.014 "reset": true, 00:18:19.014 "nvme_admin": false, 00:18:19.014 "nvme_io": false, 00:18:19.015 "nvme_io_md": false, 00:18:19.015 "write_zeroes": true, 00:18:19.015 "zcopy": false, 00:18:19.015 "get_zone_info": false, 00:18:19.015 "zone_management": false, 00:18:19.015 "zone_append": false, 00:18:19.015 "compare": false, 00:18:19.015 "compare_and_write": false, 00:18:19.015 "abort": false, 00:18:19.015 "seek_hole": false, 00:18:19.015 "seek_data": false, 00:18:19.015 "copy": false, 00:18:19.015 "nvme_iov_md": false 00:18:19.015 }, 00:18:19.015 "memory_domains": [ 00:18:19.015 { 00:18:19.015 "dma_device_id": "system", 00:18:19.015 "dma_device_type": 1 00:18:19.015 }, 00:18:19.015 { 00:18:19.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.015 "dma_device_type": 2 00:18:19.015 }, 00:18:19.015 { 00:18:19.015 "dma_device_id": "system", 00:18:19.015 "dma_device_type": 1 00:18:19.015 }, 00:18:19.015 { 00:18:19.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.015 "dma_device_type": 2 00:18:19.015 }, 00:18:19.015 { 00:18:19.015 "dma_device_id": "system", 00:18:19.015 "dma_device_type": 1 00:18:19.015 }, 00:18:19.015 { 00:18:19.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.015 "dma_device_type": 2 00:18:19.015 }, 00:18:19.015 { 
00:18:19.015 "dma_device_id": "system", 00:18:19.015 "dma_device_type": 1 00:18:19.015 }, 00:18:19.015 { 00:18:19.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.015 "dma_device_type": 2 00:18:19.015 } 00:18:19.015 ], 00:18:19.015 "driver_specific": { 00:18:19.015 "raid": { 00:18:19.015 "uuid": "4d7e9133-b858-484a-a125-cbb2357397c5", 00:18:19.015 "strip_size_kb": 64, 00:18:19.015 "state": "online", 00:18:19.015 "raid_level": "raid0", 00:18:19.015 "superblock": true, 00:18:19.015 "num_base_bdevs": 4, 00:18:19.015 "num_base_bdevs_discovered": 4, 00:18:19.015 "num_base_bdevs_operational": 4, 00:18:19.015 "base_bdevs_list": [ 00:18:19.015 { 00:18:19.015 "name": "BaseBdev1", 00:18:19.015 "uuid": "f2490c29-f9c9-4b00-998b-2d21a456e622", 00:18:19.015 "is_configured": true, 00:18:19.015 "data_offset": 2048, 00:18:19.015 "data_size": 63488 00:18:19.015 }, 00:18:19.015 { 00:18:19.015 "name": "BaseBdev2", 00:18:19.015 "uuid": "59b81aa8-7f61-44fb-9729-bc08e143c59e", 00:18:19.015 "is_configured": true, 00:18:19.015 "data_offset": 2048, 00:18:19.015 "data_size": 63488 00:18:19.015 }, 00:18:19.015 { 00:18:19.015 "name": "BaseBdev3", 00:18:19.015 "uuid": "f2e21c72-c833-473b-8fba-1c3284580b8d", 00:18:19.015 "is_configured": true, 00:18:19.015 "data_offset": 2048, 00:18:19.015 "data_size": 63488 00:18:19.015 }, 00:18:19.015 { 00:18:19.015 "name": "BaseBdev4", 00:18:19.015 "uuid": "fff44a0e-230a-44eb-a77c-acb43de83e08", 00:18:19.015 "is_configured": true, 00:18:19.015 "data_offset": 2048, 00:18:19.015 "data_size": 63488 00:18:19.015 } 00:18:19.015 ] 00:18:19.015 } 00:18:19.015 } 00:18:19.015 }' 00:18:19.015 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:19.015 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:19.015 BaseBdev2 00:18:19.015 BaseBdev3 00:18:19.015 BaseBdev4' 00:18:19.015 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:19.015 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:19.015 09:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:19.273 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:19.273 "name": "BaseBdev1", 00:18:19.273 "aliases": [ 00:18:19.273 "f2490c29-f9c9-4b00-998b-2d21a456e622" 00:18:19.273 ], 00:18:19.273 "product_name": "Malloc disk", 00:18:19.273 "block_size": 512, 00:18:19.273 "num_blocks": 65536, 00:18:19.273 "uuid": "f2490c29-f9c9-4b00-998b-2d21a456e622", 00:18:19.273 "assigned_rate_limits": { 00:18:19.273 "rw_ios_per_sec": 0, 00:18:19.273 "rw_mbytes_per_sec": 0, 00:18:19.273 "r_mbytes_per_sec": 0, 00:18:19.273 "w_mbytes_per_sec": 0 00:18:19.273 }, 00:18:19.273 "claimed": true, 00:18:19.273 "claim_type": "exclusive_write", 00:18:19.273 "zoned": false, 00:18:19.273 "supported_io_types": { 00:18:19.273 "read": true, 00:18:19.273 "write": true, 00:18:19.273 "unmap": true, 00:18:19.273 "flush": true, 00:18:19.273 "reset": true, 00:18:19.273 "nvme_admin": false, 00:18:19.273 "nvme_io": false, 00:18:19.273 "nvme_io_md": false, 00:18:19.273 "write_zeroes": true, 00:18:19.273 "zcopy": true, 00:18:19.273 "get_zone_info": false, 00:18:19.273 "zone_management": false, 00:18:19.273 "zone_append": 
false, 00:18:19.273 "compare": false, 00:18:19.273 "compare_and_write": false, 00:18:19.273 "abort": true, 00:18:19.273 "seek_hole": false, 00:18:19.273 "seek_data": false, 00:18:19.273 "copy": true, 00:18:19.273 "nvme_iov_md": false 00:18:19.273 }, 00:18:19.273 "memory_domains": [ 00:18:19.273 { 00:18:19.273 "dma_device_id": "system", 00:18:19.273 "dma_device_type": 1 00:18:19.273 }, 00:18:19.273 { 00:18:19.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.273 "dma_device_type": 2 00:18:19.273 } 00:18:19.273 ], 00:18:19.273 "driver_specific": {} 00:18:19.273 }' 00:18:19.273 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.273 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.273 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:19.273 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.273 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:19.531 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:19.788 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:19.788 "name": "BaseBdev2", 00:18:19.788 "aliases": [ 00:18:19.788 "59b81aa8-7f61-44fb-9729-bc08e143c59e" 00:18:19.788 ], 00:18:19.788 "product_name": "Malloc disk", 00:18:19.788 "block_size": 512, 00:18:19.788 "num_blocks": 65536, 00:18:19.788 "uuid": "59b81aa8-7f61-44fb-9729-bc08e143c59e", 00:18:19.788 "assigned_rate_limits": { 00:18:19.788 "rw_ios_per_sec": 0, 00:18:19.788 "rw_mbytes_per_sec": 0, 00:18:19.788 "r_mbytes_per_sec": 0, 00:18:19.788 "w_mbytes_per_sec": 0 00:18:19.788 }, 00:18:19.788 "claimed": true, 00:18:19.788 "claim_type": "exclusive_write", 00:18:19.788 "zoned": false, 00:18:19.788 "supported_io_types": { 00:18:19.788 "read": true, 00:18:19.788 "write": true, 00:18:19.788 "unmap": true, 00:18:19.788 "flush": true, 00:18:19.788 "reset": true, 00:18:19.788 "nvme_admin": false, 00:18:19.788 "nvme_io": false, 00:18:19.788 "nvme_io_md": false, 00:18:19.788 "write_zeroes": true, 00:18:19.788 "zcopy": true, 00:18:19.788 "get_zone_info": false, 00:18:19.788 "zone_management": false, 00:18:19.788 "zone_append": false, 00:18:19.788 "compare": false, 00:18:19.788 "compare_and_write": false, 00:18:19.788 "abort": true, 00:18:19.788 "seek_hole": 
false, 00:18:19.788 "seek_data": false, 00:18:19.788 "copy": true, 00:18:19.788 "nvme_iov_md": false 00:18:19.788 }, 00:18:19.788 "memory_domains": [ 00:18:19.788 { 00:18:19.788 "dma_device_id": "system", 00:18:19.788 "dma_device_type": 1 00:18:19.788 }, 00:18:19.788 { 00:18:19.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.788 "dma_device_type": 2 00:18:19.788 } 00:18:19.788 ], 00:18:19.788 "driver_specific": {} 00:18:19.788 }' 00:18:19.788 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.788 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.045 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:20.045 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.045 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.045 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:20.045 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.045 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.045 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:20.045 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.045 09:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.303 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:20.303 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:20.303 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:20.303 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:20.561 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:20.561 "name": "BaseBdev3", 00:18:20.561 "aliases": [ 00:18:20.561 "f2e21c72-c833-473b-8fba-1c3284580b8d" 00:18:20.561 ], 00:18:20.561 "product_name": "Malloc disk", 00:18:20.561 "block_size": 512, 00:18:20.561 "num_blocks": 65536, 00:18:20.561 "uuid": "f2e21c72-c833-473b-8fba-1c3284580b8d", 00:18:20.561 "assigned_rate_limits": { 00:18:20.561 "rw_ios_per_sec": 0, 00:18:20.561 "rw_mbytes_per_sec": 0, 00:18:20.561 "r_mbytes_per_sec": 0, 00:18:20.561 "w_mbytes_per_sec": 0 00:18:20.561 }, 00:18:20.561 "claimed": true, 00:18:20.561 "claim_type": "exclusive_write", 00:18:20.561 "zoned": false, 00:18:20.561 "supported_io_types": { 00:18:20.561 "read": true, 00:18:20.561 "write": true, 00:18:20.561 "unmap": true, 00:18:20.561 "flush": true, 00:18:20.561 "reset": true, 00:18:20.561 "nvme_admin": false, 00:18:20.561 "nvme_io": false, 00:18:20.561 "nvme_io_md": false, 00:18:20.561 "write_zeroes": true, 00:18:20.561 "zcopy": true, 00:18:20.561 "get_zone_info": false, 00:18:20.561 "zone_management": false, 00:18:20.561 "zone_append": false, 00:18:20.561 "compare": false, 00:18:20.561 "compare_and_write": false, 00:18:20.561 "abort": true, 00:18:20.561 "seek_hole": false, 00:18:20.561 "seek_data": false, 00:18:20.561 "copy": true, 00:18:20.561 "nvme_iov_md": false 00:18:20.561 }, 00:18:20.561 
"memory_domains": [ 00:18:20.561 { 00:18:20.561 "dma_device_id": "system", 00:18:20.561 "dma_device_type": 1 00:18:20.561 }, 00:18:20.561 { 00:18:20.561 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.561 "dma_device_type": 2 00:18:20.561 } 00:18:20.561 ], 00:18:20.561 "driver_specific": {} 00:18:20.561 }' 00:18:20.561 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.561 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.561 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:20.561 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.561 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.561 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:20.561 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.561 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.820 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:20.820 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.820 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.820 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:20.820 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:20.820 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:20.820 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:21.077 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:21.077 "name": "BaseBdev4", 00:18:21.077 "aliases": [ 00:18:21.077 "fff44a0e-230a-44eb-a77c-acb43de83e08" 00:18:21.077 ], 00:18:21.077 "product_name": "Malloc disk", 00:18:21.077 "block_size": 512, 00:18:21.077 "num_blocks": 65536, 00:18:21.077 "uuid": "fff44a0e-230a-44eb-a77c-acb43de83e08", 00:18:21.077 "assigned_rate_limits": { 00:18:21.077 "rw_ios_per_sec": 0, 00:18:21.077 "rw_mbytes_per_sec": 0, 00:18:21.077 "r_mbytes_per_sec": 0, 00:18:21.077 "w_mbytes_per_sec": 0 00:18:21.077 }, 00:18:21.077 "claimed": true, 00:18:21.077 "claim_type": "exclusive_write", 00:18:21.077 "zoned": false, 00:18:21.077 "supported_io_types": { 00:18:21.077 "read": true, 00:18:21.077 "write": true, 00:18:21.077 "unmap": true, 00:18:21.077 "flush": true, 00:18:21.077 "reset": true, 00:18:21.077 "nvme_admin": false, 00:18:21.077 "nvme_io": false, 00:18:21.077 "nvme_io_md": false, 00:18:21.077 "write_zeroes": true, 00:18:21.077 "zcopy": true, 00:18:21.077 "get_zone_info": false, 00:18:21.077 "zone_management": false, 00:18:21.077 "zone_append": false, 00:18:21.077 "compare": false, 00:18:21.077 "compare_and_write": false, 00:18:21.077 "abort": true, 00:18:21.077 "seek_hole": false, 00:18:21.077 "seek_data": false, 00:18:21.077 "copy": true, 00:18:21.077 "nvme_iov_md": false 00:18:21.077 }, 00:18:21.077 "memory_domains": [ 00:18:21.077 { 00:18:21.077 "dma_device_id": "system", 00:18:21.077 "dma_device_type": 1 00:18:21.077 }, 
00:18:21.077 { 00:18:21.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.077 "dma_device_type": 2 00:18:21.077 } 00:18:21.077 ], 00:18:21.077 "driver_specific": {} 00:18:21.077 }' 00:18:21.077 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.077 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.077 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:21.077 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.077 09:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.334 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:21.334 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.334 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.334 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:21.334 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.334 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.334 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:21.334 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:21.593 [2024-07-15 09:22:30.359284] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:21.593 [2024-07-15 09:22:30.359312] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:21.593 [2024-07-15 09:22:30.359359] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.593 09:22:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.593 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:21.852 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:21.852 "name": "Existed_Raid", 00:18:21.852 "uuid": "4d7e9133-b858-484a-a125-cbb2357397c5", 00:18:21.852 "strip_size_kb": 64, 00:18:21.852 "state": "offline", 00:18:21.852 "raid_level": "raid0", 00:18:21.852 "superblock": true, 00:18:21.852 "num_base_bdevs": 4, 00:18:21.852 "num_base_bdevs_discovered": 3, 00:18:21.852 "num_base_bdevs_operational": 3, 00:18:21.852 "base_bdevs_list": [ 00:18:21.852 { 00:18:21.852 "name": null, 00:18:21.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.852 "is_configured": false, 00:18:21.852 "data_offset": 2048, 00:18:21.852 "data_size": 63488 00:18:21.852 }, 00:18:21.852 { 00:18:21.852 "name": "BaseBdev2", 00:18:21.852 "uuid": "59b81aa8-7f61-44fb-9729-bc08e143c59e", 00:18:21.852 "is_configured": true, 00:18:21.852 "data_offset": 2048, 00:18:21.852 "data_size": 63488 00:18:21.852 }, 00:18:21.852 { 00:18:21.852 "name": "BaseBdev3", 00:18:21.852 "uuid": "f2e21c72-c833-473b-8fba-1c3284580b8d", 00:18:21.852 "is_configured": true, 00:18:21.852 "data_offset": 2048, 00:18:21.852 "data_size": 63488 00:18:21.852 }, 00:18:21.852 { 00:18:21.852 "name": "BaseBdev4", 00:18:21.852 "uuid": "fff44a0e-230a-44eb-a77c-acb43de83e08", 00:18:21.852 "is_configured": true, 00:18:21.852 "data_offset": 2048, 00:18:21.852 "data_size": 63488 00:18:21.853 } 00:18:21.853 ] 00:18:21.853 }' 00:18:21.853 09:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:21.853 09:22:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:22.419 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:22.419 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:22.419 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.419 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:22.677 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:22.677 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:22.677 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:22.936 [2024-07-15 09:22:31.699899] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:22.936 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:22.936 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:22.936 09:22:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.936 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:23.195 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:23.195 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:23.195 09:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:23.453 [2024-07-15 09:22:32.195783] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:23.453 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:23.453 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:23.453 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.453 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:23.712 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:23.712 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:23.712 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:23.971 [2024-07-15 09:22:32.699598] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:23.971 [2024-07-15 09:22:32.699644] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc9350 name Existed_Raid, state offline 00:18:23.971 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:23.971 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:23.971 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.971 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:24.229 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:24.229 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:24.229 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:24.229 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:24.229 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:24.229 09:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:24.486 BaseBdev2 00:18:24.486 09:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:24.486 09:22:33 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:24.486 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:24.486 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:24.486 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:24.486 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:24.486 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:24.744 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:24.744 [ 00:18:24.744 { 00:18:24.744 "name": "BaseBdev2", 00:18:24.744 "aliases": [ 00:18:24.744 "96bc10df-b22c-4324-9457-35803887b734" 00:18:24.744 ], 00:18:24.744 "product_name": "Malloc disk", 00:18:24.744 "block_size": 512, 00:18:24.744 "num_blocks": 65536, 00:18:24.744 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:24.744 "assigned_rate_limits": { 00:18:24.744 "rw_ios_per_sec": 0, 00:18:24.744 "rw_mbytes_per_sec": 0, 00:18:24.744 "r_mbytes_per_sec": 0, 00:18:24.744 "w_mbytes_per_sec": 0 00:18:24.744 }, 00:18:24.744 "claimed": false, 00:18:24.744 "zoned": false, 00:18:24.744 "supported_io_types": { 00:18:24.744 "read": true, 00:18:24.744 "write": true, 00:18:24.744 "unmap": true, 00:18:24.744 "flush": true, 00:18:24.744 "reset": true, 00:18:24.744 "nvme_admin": false, 00:18:24.744 "nvme_io": false, 00:18:24.744 "nvme_io_md": false, 00:18:24.744 "write_zeroes": true, 00:18:24.744 "zcopy": true, 00:18:24.744 "get_zone_info": false, 00:18:24.744 "zone_management": false, 00:18:24.744 "zone_append": false, 00:18:24.744 "compare": false, 00:18:24.744 "compare_and_write": false, 00:18:24.744 "abort": true, 00:18:24.744 "seek_hole": false, 00:18:24.744 "seek_data": false, 00:18:24.744 "copy": true, 00:18:24.744 "nvme_iov_md": false 00:18:24.744 }, 00:18:24.744 "memory_domains": [ 00:18:24.744 { 00:18:24.744 "dma_device_id": "system", 00:18:24.744 "dma_device_type": 1 00:18:24.744 }, 00:18:24.744 { 00:18:24.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.744 "dma_device_type": 2 00:18:24.744 } 00:18:24.744 ], 00:18:24.744 "driver_specific": {} 00:18:24.744 } 00:18:24.744 ] 00:18:24.744 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:24.744 09:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:25.002 09:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:25.002 09:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:25.002 BaseBdev3 00:18:25.002 09:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:25.002 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:25.002 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:25.002 09:22:33 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:18:25.002 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:25.002 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:25.002 09:22:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:25.259 09:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:25.518 [ 00:18:25.518 { 00:18:25.518 "name": "BaseBdev3", 00:18:25.518 "aliases": [ 00:18:25.518 "ae6965ba-40a1-422d-8dc0-d9bd64afa080" 00:18:25.518 ], 00:18:25.518 "product_name": "Malloc disk", 00:18:25.518 "block_size": 512, 00:18:25.518 "num_blocks": 65536, 00:18:25.518 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:25.518 "assigned_rate_limits": { 00:18:25.518 "rw_ios_per_sec": 0, 00:18:25.518 "rw_mbytes_per_sec": 0, 00:18:25.518 "r_mbytes_per_sec": 0, 00:18:25.518 "w_mbytes_per_sec": 0 00:18:25.518 }, 00:18:25.518 "claimed": false, 00:18:25.518 "zoned": false, 00:18:25.518 "supported_io_types": { 00:18:25.518 "read": true, 00:18:25.518 "write": true, 00:18:25.518 "unmap": true, 00:18:25.518 "flush": true, 00:18:25.518 "reset": true, 00:18:25.518 "nvme_admin": false, 00:18:25.518 "nvme_io": false, 00:18:25.518 "nvme_io_md": false, 00:18:25.518 "write_zeroes": true, 00:18:25.518 "zcopy": true, 00:18:25.518 "get_zone_info": false, 00:18:25.518 "zone_management": false, 00:18:25.518 "zone_append": false, 00:18:25.518 "compare": false, 00:18:25.518 "compare_and_write": false, 00:18:25.518 "abort": true, 00:18:25.518 "seek_hole": false, 00:18:25.518 "seek_data": false, 00:18:25.518 "copy": true, 00:18:25.518 "nvme_iov_md": false 00:18:25.518 }, 00:18:25.518 "memory_domains": [ 00:18:25.518 { 00:18:25.518 "dma_device_id": "system", 00:18:25.518 "dma_device_type": 1 00:18:25.518 }, 00:18:25.518 { 00:18:25.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.518 "dma_device_type": 2 00:18:25.518 } 00:18:25.518 ], 00:18:25.518 "driver_specific": {} 00:18:25.518 } 00:18:25.518 ] 00:18:25.518 09:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:25.518 09:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:25.518 09:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:25.518 09:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:25.775 BaseBdev4 00:18:25.775 09:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:25.775 09:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:25.775 09:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:25.775 09:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:25.775 09:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:25.775 09:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:18:25.775 09:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:26.033 09:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:26.291 [ 00:18:26.291 { 00:18:26.291 "name": "BaseBdev4", 00:18:26.291 "aliases": [ 00:18:26.291 "8b275add-acf2-49b3-8e82-c8dc274ee577" 00:18:26.291 ], 00:18:26.291 "product_name": "Malloc disk", 00:18:26.291 "block_size": 512, 00:18:26.291 "num_blocks": 65536, 00:18:26.291 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:26.291 "assigned_rate_limits": { 00:18:26.291 "rw_ios_per_sec": 0, 00:18:26.291 "rw_mbytes_per_sec": 0, 00:18:26.291 "r_mbytes_per_sec": 0, 00:18:26.291 "w_mbytes_per_sec": 0 00:18:26.291 }, 00:18:26.291 "claimed": false, 00:18:26.291 "zoned": false, 00:18:26.291 "supported_io_types": { 00:18:26.291 "read": true, 00:18:26.291 "write": true, 00:18:26.291 "unmap": true, 00:18:26.291 "flush": true, 00:18:26.291 "reset": true, 00:18:26.291 "nvme_admin": false, 00:18:26.291 "nvme_io": false, 00:18:26.291 "nvme_io_md": false, 00:18:26.291 "write_zeroes": true, 00:18:26.291 "zcopy": true, 00:18:26.291 "get_zone_info": false, 00:18:26.291 "zone_management": false, 00:18:26.291 "zone_append": false, 00:18:26.291 "compare": false, 00:18:26.291 "compare_and_write": false, 00:18:26.291 "abort": true, 00:18:26.291 "seek_hole": false, 00:18:26.291 "seek_data": false, 00:18:26.291 "copy": true, 00:18:26.291 "nvme_iov_md": false 00:18:26.291 }, 00:18:26.291 "memory_domains": [ 00:18:26.291 { 00:18:26.291 "dma_device_id": "system", 00:18:26.291 "dma_device_type": 1 00:18:26.291 }, 00:18:26.291 { 00:18:26.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:26.291 "dma_device_type": 2 00:18:26.291 } 00:18:26.291 ], 00:18:26.291 "driver_specific": {} 00:18:26.291 } 00:18:26.291 ] 00:18:26.291 09:22:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:26.291 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:26.291 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:26.291 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:26.548 [2024-07-15 09:22:35.313291] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:26.548 [2024-07-15 09:22:35.313333] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:26.548 [2024-07-15 09:22:35.313354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:26.548 [2024-07-15 09:22:35.314684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:26.548 [2024-07-15 09:22:35.314726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.548 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.806 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.806 "name": "Existed_Raid", 00:18:26.806 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:26.806 "strip_size_kb": 64, 00:18:26.806 "state": "configuring", 00:18:26.806 "raid_level": "raid0", 00:18:26.806 "superblock": true, 00:18:26.806 "num_base_bdevs": 4, 00:18:26.806 "num_base_bdevs_discovered": 3, 00:18:26.806 "num_base_bdevs_operational": 4, 00:18:26.806 "base_bdevs_list": [ 00:18:26.806 { 00:18:26.806 "name": "BaseBdev1", 00:18:26.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.806 "is_configured": false, 00:18:26.806 "data_offset": 0, 00:18:26.806 "data_size": 0 00:18:26.806 }, 00:18:26.806 { 00:18:26.806 "name": "BaseBdev2", 00:18:26.806 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:26.806 "is_configured": true, 00:18:26.806 "data_offset": 2048, 00:18:26.806 "data_size": 63488 00:18:26.806 }, 00:18:26.806 { 00:18:26.806 "name": "BaseBdev3", 00:18:26.806 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:26.806 "is_configured": true, 00:18:26.806 "data_offset": 2048, 00:18:26.806 "data_size": 63488 00:18:26.806 }, 00:18:26.806 { 00:18:26.806 "name": "BaseBdev4", 00:18:26.806 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:26.806 "is_configured": true, 00:18:26.806 "data_offset": 2048, 00:18:26.806 "data_size": 63488 00:18:26.806 } 00:18:26.806 ] 00:18:26.806 }' 00:18:26.806 09:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.806 09:22:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.372 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:27.372 [2024-07-15 09:22:36.324034] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.630 09:22:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.630 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.887 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.887 "name": "Existed_Raid", 00:18:27.887 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:27.887 "strip_size_kb": 64, 00:18:27.887 "state": "configuring", 00:18:27.887 "raid_level": "raid0", 00:18:27.887 "superblock": true, 00:18:27.887 "num_base_bdevs": 4, 00:18:27.887 "num_base_bdevs_discovered": 2, 00:18:27.887 "num_base_bdevs_operational": 4, 00:18:27.887 "base_bdevs_list": [ 00:18:27.887 { 00:18:27.887 "name": "BaseBdev1", 00:18:27.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.887 "is_configured": false, 00:18:27.887 "data_offset": 0, 00:18:27.887 "data_size": 0 00:18:27.887 }, 00:18:27.887 { 00:18:27.887 "name": null, 00:18:27.887 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:27.888 "is_configured": false, 00:18:27.888 "data_offset": 2048, 00:18:27.888 "data_size": 63488 00:18:27.888 }, 00:18:27.888 { 00:18:27.888 "name": "BaseBdev3", 00:18:27.888 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:27.888 "is_configured": true, 00:18:27.888 "data_offset": 2048, 00:18:27.888 "data_size": 63488 00:18:27.888 }, 00:18:27.888 { 00:18:27.888 "name": "BaseBdev4", 00:18:27.888 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:27.888 "is_configured": true, 00:18:27.888 "data_offset": 2048, 00:18:27.888 "data_size": 63488 00:18:27.888 } 00:18:27.888 ] 00:18:27.888 }' 00:18:27.888 09:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.888 09:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:28.474 09:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.474 09:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:28.767 09:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:28.767 09:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
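The three-step pattern traced above -- create a malloc bdev, wait for examine, then dump its descriptor -- is what produces the JSON blocks seen throughout this log. A minimal sketch of the same sequence run by hand against the test's RPC socket (assuming an SPDK application is already listening on /var/tmp/spdk-raid.sock; paths and arguments are the ones used in this run):

  # create a 32 MB malloc bdev with a 512-byte block size (65536 blocks, matching the descriptors above)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  # block until bdev examine has completed for registered bdevs
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
  # print the bdev descriptor JSON, with the same -t 2000 timeout the test's waitforbdev helper passes
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000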
00:18:28.767 [2024-07-15 09:22:37.658965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:28.767 BaseBdev1 00:18:28.767 09:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:28.767 09:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:28.767 09:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:28.767 09:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:28.767 09:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:28.767 09:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:28.767 09:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:29.024 09:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:29.282 [ 00:18:29.282 { 00:18:29.282 "name": "BaseBdev1", 00:18:29.282 "aliases": [ 00:18:29.282 "b729c00b-45f1-4595-81ce-51107e6a9192" 00:18:29.282 ], 00:18:29.282 "product_name": "Malloc disk", 00:18:29.282 "block_size": 512, 00:18:29.282 "num_blocks": 65536, 00:18:29.282 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:29.282 "assigned_rate_limits": { 00:18:29.282 "rw_ios_per_sec": 0, 00:18:29.282 "rw_mbytes_per_sec": 0, 00:18:29.282 "r_mbytes_per_sec": 0, 00:18:29.282 "w_mbytes_per_sec": 0 00:18:29.282 }, 00:18:29.282 "claimed": true, 00:18:29.282 "claim_type": "exclusive_write", 00:18:29.282 "zoned": false, 00:18:29.282 "supported_io_types": { 00:18:29.282 "read": true, 00:18:29.282 "write": true, 00:18:29.282 "unmap": true, 00:18:29.282 "flush": true, 00:18:29.282 "reset": true, 00:18:29.282 "nvme_admin": false, 00:18:29.282 "nvme_io": false, 00:18:29.282 "nvme_io_md": false, 00:18:29.282 "write_zeroes": true, 00:18:29.282 "zcopy": true, 00:18:29.282 "get_zone_info": false, 00:18:29.282 "zone_management": false, 00:18:29.282 "zone_append": false, 00:18:29.282 "compare": false, 00:18:29.282 "compare_and_write": false, 00:18:29.282 "abort": true, 00:18:29.282 "seek_hole": false, 00:18:29.282 "seek_data": false, 00:18:29.282 "copy": true, 00:18:29.282 "nvme_iov_md": false 00:18:29.282 }, 00:18:29.282 "memory_domains": [ 00:18:29.282 { 00:18:29.282 "dma_device_id": "system", 00:18:29.282 "dma_device_type": 1 00:18:29.282 }, 00:18:29.282 { 00:18:29.282 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.282 "dma_device_type": 2 00:18:29.282 } 00:18:29.282 ], 00:18:29.282 "driver_specific": {} 00:18:29.282 } 00:18:29.282 ] 00:18:29.282 09:22:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:29.282 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:29.282 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.283 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.540 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.540 "name": "Existed_Raid", 00:18:29.540 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:29.541 "strip_size_kb": 64, 00:18:29.541 "state": "configuring", 00:18:29.541 "raid_level": "raid0", 00:18:29.541 "superblock": true, 00:18:29.541 "num_base_bdevs": 4, 00:18:29.541 "num_base_bdevs_discovered": 3, 00:18:29.541 "num_base_bdevs_operational": 4, 00:18:29.541 "base_bdevs_list": [ 00:18:29.541 { 00:18:29.541 "name": "BaseBdev1", 00:18:29.541 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:29.541 "is_configured": true, 00:18:29.541 "data_offset": 2048, 00:18:29.541 "data_size": 63488 00:18:29.541 }, 00:18:29.541 { 00:18:29.541 "name": null, 00:18:29.541 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:29.541 "is_configured": false, 00:18:29.541 "data_offset": 2048, 00:18:29.541 "data_size": 63488 00:18:29.541 }, 00:18:29.541 { 00:18:29.541 "name": "BaseBdev3", 00:18:29.541 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:29.541 "is_configured": true, 00:18:29.541 "data_offset": 2048, 00:18:29.541 "data_size": 63488 00:18:29.541 }, 00:18:29.541 { 00:18:29.541 "name": "BaseBdev4", 00:18:29.541 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:29.541 "is_configured": true, 00:18:29.541 "data_offset": 2048, 00:18:29.541 "data_size": 63488 00:18:29.541 } 00:18:29.541 ] 00:18:29.541 }' 00:18:29.541 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.541 09:22:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:30.107 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.107 09:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:30.365 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:30.365 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:30.623 [2024-07-15 09:22:39.387586] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:30.623 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:30.623 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.623 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:30.623 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:30.624 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:30.624 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:30.624 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.624 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.624 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.624 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.624 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.624 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.882 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.882 "name": "Existed_Raid", 00:18:30.882 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:30.882 "strip_size_kb": 64, 00:18:30.882 "state": "configuring", 00:18:30.882 "raid_level": "raid0", 00:18:30.882 "superblock": true, 00:18:30.882 "num_base_bdevs": 4, 00:18:30.882 "num_base_bdevs_discovered": 2, 00:18:30.882 "num_base_bdevs_operational": 4, 00:18:30.882 "base_bdevs_list": [ 00:18:30.882 { 00:18:30.882 "name": "BaseBdev1", 00:18:30.882 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:30.882 "is_configured": true, 00:18:30.882 "data_offset": 2048, 00:18:30.882 "data_size": 63488 00:18:30.882 }, 00:18:30.882 { 00:18:30.882 "name": null, 00:18:30.882 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:30.882 "is_configured": false, 00:18:30.882 "data_offset": 2048, 00:18:30.882 "data_size": 63488 00:18:30.882 }, 00:18:30.882 { 00:18:30.882 "name": null, 00:18:30.882 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:30.882 "is_configured": false, 00:18:30.882 "data_offset": 2048, 00:18:30.882 "data_size": 63488 00:18:30.882 }, 00:18:30.882 { 00:18:30.882 "name": "BaseBdev4", 00:18:30.882 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:30.882 "is_configured": true, 00:18:30.882 "data_offset": 2048, 00:18:30.882 "data_size": 63488 00:18:30.882 } 00:18:30.882 ] 00:18:30.882 }' 00:18:30.882 09:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.882 09:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:31.448 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.448 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:31.706 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:31.706 
09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:31.964 [2024-07-15 09:22:40.715154] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.964 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:32.223 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:32.223 "name": "Existed_Raid", 00:18:32.223 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:32.223 "strip_size_kb": 64, 00:18:32.223 "state": "configuring", 00:18:32.223 "raid_level": "raid0", 00:18:32.223 "superblock": true, 00:18:32.223 "num_base_bdevs": 4, 00:18:32.223 "num_base_bdevs_discovered": 3, 00:18:32.223 "num_base_bdevs_operational": 4, 00:18:32.223 "base_bdevs_list": [ 00:18:32.223 { 00:18:32.223 "name": "BaseBdev1", 00:18:32.223 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:32.223 "is_configured": true, 00:18:32.223 "data_offset": 2048, 00:18:32.223 "data_size": 63488 00:18:32.223 }, 00:18:32.223 { 00:18:32.223 "name": null, 00:18:32.223 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:32.223 "is_configured": false, 00:18:32.223 "data_offset": 2048, 00:18:32.223 "data_size": 63488 00:18:32.223 }, 00:18:32.223 { 00:18:32.223 "name": "BaseBdev3", 00:18:32.223 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:32.223 "is_configured": true, 00:18:32.223 "data_offset": 2048, 00:18:32.223 "data_size": 63488 00:18:32.223 }, 00:18:32.223 { 00:18:32.223 "name": "BaseBdev4", 00:18:32.223 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:32.223 "is_configured": true, 00:18:32.223 "data_offset": 2048, 00:18:32.223 "data_size": 63488 00:18:32.223 } 00:18:32.223 ] 00:18:32.223 }' 00:18:32.223 09:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:32.223 09:22:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:32.789 09:22:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.789 09:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:33.047 09:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:33.047 09:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:33.306 [2024-07-15 09:22:42.030628] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.306 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.564 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.564 "name": "Existed_Raid", 00:18:33.564 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:33.564 "strip_size_kb": 64, 00:18:33.564 "state": "configuring", 00:18:33.564 "raid_level": "raid0", 00:18:33.564 "superblock": true, 00:18:33.564 "num_base_bdevs": 4, 00:18:33.564 "num_base_bdevs_discovered": 2, 00:18:33.565 "num_base_bdevs_operational": 4, 00:18:33.565 "base_bdevs_list": [ 00:18:33.565 { 00:18:33.565 "name": null, 00:18:33.565 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:33.565 "is_configured": false, 00:18:33.565 "data_offset": 2048, 00:18:33.565 "data_size": 63488 00:18:33.565 }, 00:18:33.565 { 00:18:33.565 "name": null, 00:18:33.565 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:33.565 "is_configured": false, 00:18:33.565 "data_offset": 2048, 00:18:33.565 "data_size": 63488 00:18:33.565 }, 00:18:33.565 { 00:18:33.565 "name": "BaseBdev3", 00:18:33.565 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:33.565 "is_configured": true, 00:18:33.565 "data_offset": 2048, 00:18:33.565 "data_size": 63488 00:18:33.565 }, 00:18:33.565 { 00:18:33.565 "name": "BaseBdev4", 00:18:33.565 "uuid": 
"8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:33.565 "is_configured": true, 00:18:33.565 "data_offset": 2048, 00:18:33.565 "data_size": 63488 00:18:33.565 } 00:18:33.565 ] 00:18:33.565 }' 00:18:33.565 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.565 09:22:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:34.132 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.132 09:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:34.391 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:34.391 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:34.651 [2024-07-15 09:22:43.344831] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:34.651 "name": "Existed_Raid", 00:18:34.651 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:34.651 "strip_size_kb": 64, 00:18:34.651 "state": "configuring", 00:18:34.651 "raid_level": "raid0", 00:18:34.651 "superblock": true, 00:18:34.651 "num_base_bdevs": 4, 00:18:34.651 "num_base_bdevs_discovered": 3, 00:18:34.651 "num_base_bdevs_operational": 4, 00:18:34.651 "base_bdevs_list": [ 00:18:34.651 { 00:18:34.651 "name": null, 00:18:34.651 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:34.651 "is_configured": false, 00:18:34.651 "data_offset": 2048, 00:18:34.651 "data_size": 63488 00:18:34.651 }, 00:18:34.651 { 00:18:34.651 "name": "BaseBdev2", 00:18:34.651 "uuid": 
"96bc10df-b22c-4324-9457-35803887b734", 00:18:34.651 "is_configured": true, 00:18:34.651 "data_offset": 2048, 00:18:34.651 "data_size": 63488 00:18:34.651 }, 00:18:34.651 { 00:18:34.651 "name": "BaseBdev3", 00:18:34.651 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:34.651 "is_configured": true, 00:18:34.651 "data_offset": 2048, 00:18:34.651 "data_size": 63488 00:18:34.651 }, 00:18:34.651 { 00:18:34.651 "name": "BaseBdev4", 00:18:34.651 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:34.651 "is_configured": true, 00:18:34.651 "data_offset": 2048, 00:18:34.651 "data_size": 63488 00:18:34.651 } 00:18:34.651 ] 00:18:34.651 }' 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:34.651 09:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:35.218 09:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.218 09:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:35.786 09:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:35.786 09:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.786 09:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:36.045 09:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b729c00b-45f1-4595-81ce-51107e6a9192 00:18:36.304 [2024-07-15 09:22:45.021815] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:36.304 [2024-07-15 09:22:45.021995] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfcf470 00:18:36.304 [2024-07-15 09:22:45.022009] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:36.304 [2024-07-15 09:22:45.022191] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbfc40 00:18:36.304 [2024-07-15 09:22:45.022309] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfcf470 00:18:36.305 [2024-07-15 09:22:45.022319] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfcf470 00:18:36.305 [2024-07-15 09:22:45.022414] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:36.305 NewBaseBdev 00:18:36.305 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:36.305 09:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:36.305 09:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:36.305 09:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:36.305 09:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:36.305 09:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:36.305 09:22:45 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:36.564 09:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:36.564 [ 00:18:36.564 { 00:18:36.564 "name": "NewBaseBdev", 00:18:36.564 "aliases": [ 00:18:36.564 "b729c00b-45f1-4595-81ce-51107e6a9192" 00:18:36.564 ], 00:18:36.564 "product_name": "Malloc disk", 00:18:36.564 "block_size": 512, 00:18:36.564 "num_blocks": 65536, 00:18:36.564 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:36.564 "assigned_rate_limits": { 00:18:36.564 "rw_ios_per_sec": 0, 00:18:36.564 "rw_mbytes_per_sec": 0, 00:18:36.564 "r_mbytes_per_sec": 0, 00:18:36.564 "w_mbytes_per_sec": 0 00:18:36.564 }, 00:18:36.564 "claimed": true, 00:18:36.564 "claim_type": "exclusive_write", 00:18:36.564 "zoned": false, 00:18:36.565 "supported_io_types": { 00:18:36.565 "read": true, 00:18:36.565 "write": true, 00:18:36.565 "unmap": true, 00:18:36.565 "flush": true, 00:18:36.565 "reset": true, 00:18:36.565 "nvme_admin": false, 00:18:36.565 "nvme_io": false, 00:18:36.565 "nvme_io_md": false, 00:18:36.565 "write_zeroes": true, 00:18:36.565 "zcopy": true, 00:18:36.565 "get_zone_info": false, 00:18:36.565 "zone_management": false, 00:18:36.565 "zone_append": false, 00:18:36.565 "compare": false, 00:18:36.565 "compare_and_write": false, 00:18:36.565 "abort": true, 00:18:36.565 "seek_hole": false, 00:18:36.565 "seek_data": false, 00:18:36.565 "copy": true, 00:18:36.565 "nvme_iov_md": false 00:18:36.565 }, 00:18:36.565 "memory_domains": [ 00:18:36.565 { 00:18:36.565 "dma_device_id": "system", 00:18:36.565 "dma_device_type": 1 00:18:36.565 }, 00:18:36.565 { 00:18:36.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.565 "dma_device_type": 2 00:18:36.565 } 00:18:36.565 ], 00:18:36.565 "driver_specific": {} 00:18:36.565 } 00:18:36.565 ] 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.565 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.565 09:22:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:36.825 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:36.825 "name": "Existed_Raid", 00:18:36.825 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:36.825 "strip_size_kb": 64, 00:18:36.825 "state": "online", 00:18:36.825 "raid_level": "raid0", 00:18:36.825 "superblock": true, 00:18:36.825 "num_base_bdevs": 4, 00:18:36.825 "num_base_bdevs_discovered": 4, 00:18:36.825 "num_base_bdevs_operational": 4, 00:18:36.825 "base_bdevs_list": [ 00:18:36.825 { 00:18:36.825 "name": "NewBaseBdev", 00:18:36.825 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:36.825 "is_configured": true, 00:18:36.825 "data_offset": 2048, 00:18:36.825 "data_size": 63488 00:18:36.825 }, 00:18:36.825 { 00:18:36.825 "name": "BaseBdev2", 00:18:36.825 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:36.825 "is_configured": true, 00:18:36.825 "data_offset": 2048, 00:18:36.825 "data_size": 63488 00:18:36.825 }, 00:18:36.825 { 00:18:36.825 "name": "BaseBdev3", 00:18:36.825 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:36.825 "is_configured": true, 00:18:36.825 "data_offset": 2048, 00:18:36.825 "data_size": 63488 00:18:36.825 }, 00:18:36.825 { 00:18:36.825 "name": "BaseBdev4", 00:18:36.825 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:36.825 "is_configured": true, 00:18:36.825 "data_offset": 2048, 00:18:36.825 "data_size": 63488 00:18:36.825 } 00:18:36.825 ] 00:18:36.825 }' 00:18:36.825 09:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:36.825 09:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:37.394 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:37.394 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:37.394 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:37.394 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:37.394 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:37.394 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:37.394 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:37.394 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:37.653 [2024-07-15 09:22:46.437894] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:37.653 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:37.653 "name": "Existed_Raid", 00:18:37.653 "aliases": [ 00:18:37.653 "cf1ebf7c-80ca-43e9-b964-4b314acccdc8" 00:18:37.653 ], 00:18:37.653 "product_name": "Raid Volume", 00:18:37.653 "block_size": 512, 00:18:37.653 "num_blocks": 253952, 00:18:37.653 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:37.653 "assigned_rate_limits": { 00:18:37.653 "rw_ios_per_sec": 0, 00:18:37.653 "rw_mbytes_per_sec": 0, 00:18:37.653 "r_mbytes_per_sec": 0, 00:18:37.653 "w_mbytes_per_sec": 0 00:18:37.653 }, 00:18:37.653 "claimed": false, 00:18:37.653 "zoned": 
false, 00:18:37.653 "supported_io_types": { 00:18:37.653 "read": true, 00:18:37.653 "write": true, 00:18:37.653 "unmap": true, 00:18:37.653 "flush": true, 00:18:37.653 "reset": true, 00:18:37.653 "nvme_admin": false, 00:18:37.653 "nvme_io": false, 00:18:37.653 "nvme_io_md": false, 00:18:37.653 "write_zeroes": true, 00:18:37.653 "zcopy": false, 00:18:37.653 "get_zone_info": false, 00:18:37.653 "zone_management": false, 00:18:37.653 "zone_append": false, 00:18:37.653 "compare": false, 00:18:37.653 "compare_and_write": false, 00:18:37.653 "abort": false, 00:18:37.653 "seek_hole": false, 00:18:37.653 "seek_data": false, 00:18:37.653 "copy": false, 00:18:37.653 "nvme_iov_md": false 00:18:37.653 }, 00:18:37.653 "memory_domains": [ 00:18:37.653 { 00:18:37.653 "dma_device_id": "system", 00:18:37.653 "dma_device_type": 1 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.653 "dma_device_type": 2 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "dma_device_id": "system", 00:18:37.653 "dma_device_type": 1 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.653 "dma_device_type": 2 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "dma_device_id": "system", 00:18:37.653 "dma_device_type": 1 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.653 "dma_device_type": 2 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "dma_device_id": "system", 00:18:37.653 "dma_device_type": 1 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.653 "dma_device_type": 2 00:18:37.653 } 00:18:37.653 ], 00:18:37.653 "driver_specific": { 00:18:37.653 "raid": { 00:18:37.653 "uuid": "cf1ebf7c-80ca-43e9-b964-4b314acccdc8", 00:18:37.653 "strip_size_kb": 64, 00:18:37.653 "state": "online", 00:18:37.653 "raid_level": "raid0", 00:18:37.653 "superblock": true, 00:18:37.653 "num_base_bdevs": 4, 00:18:37.653 "num_base_bdevs_discovered": 4, 00:18:37.653 "num_base_bdevs_operational": 4, 00:18:37.653 "base_bdevs_list": [ 00:18:37.653 { 00:18:37.653 "name": "NewBaseBdev", 00:18:37.653 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:37.653 "is_configured": true, 00:18:37.653 "data_offset": 2048, 00:18:37.653 "data_size": 63488 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "name": "BaseBdev2", 00:18:37.653 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:37.653 "is_configured": true, 00:18:37.653 "data_offset": 2048, 00:18:37.653 "data_size": 63488 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "name": "BaseBdev3", 00:18:37.653 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:37.653 "is_configured": true, 00:18:37.653 "data_offset": 2048, 00:18:37.653 "data_size": 63488 00:18:37.653 }, 00:18:37.653 { 00:18:37.653 "name": "BaseBdev4", 00:18:37.653 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:37.653 "is_configured": true, 00:18:37.653 "data_offset": 2048, 00:18:37.653 "data_size": 63488 00:18:37.653 } 00:18:37.653 ] 00:18:37.653 } 00:18:37.653 } 00:18:37.653 }' 00:18:37.653 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:37.653 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:37.653 BaseBdev2 00:18:37.653 BaseBdev3 00:18:37.653 BaseBdev4' 00:18:37.653 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:37.653 09:22:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:37.653 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:37.912 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:37.912 "name": "NewBaseBdev", 00:18:37.912 "aliases": [ 00:18:37.912 "b729c00b-45f1-4595-81ce-51107e6a9192" 00:18:37.912 ], 00:18:37.912 "product_name": "Malloc disk", 00:18:37.912 "block_size": 512, 00:18:37.912 "num_blocks": 65536, 00:18:37.912 "uuid": "b729c00b-45f1-4595-81ce-51107e6a9192", 00:18:37.912 "assigned_rate_limits": { 00:18:37.912 "rw_ios_per_sec": 0, 00:18:37.912 "rw_mbytes_per_sec": 0, 00:18:37.912 "r_mbytes_per_sec": 0, 00:18:37.912 "w_mbytes_per_sec": 0 00:18:37.912 }, 00:18:37.912 "claimed": true, 00:18:37.912 "claim_type": "exclusive_write", 00:18:37.912 "zoned": false, 00:18:37.912 "supported_io_types": { 00:18:37.912 "read": true, 00:18:37.912 "write": true, 00:18:37.912 "unmap": true, 00:18:37.912 "flush": true, 00:18:37.912 "reset": true, 00:18:37.912 "nvme_admin": false, 00:18:37.912 "nvme_io": false, 00:18:37.912 "nvme_io_md": false, 00:18:37.912 "write_zeroes": true, 00:18:37.912 "zcopy": true, 00:18:37.912 "get_zone_info": false, 00:18:37.912 "zone_management": false, 00:18:37.912 "zone_append": false, 00:18:37.912 "compare": false, 00:18:37.912 "compare_and_write": false, 00:18:37.912 "abort": true, 00:18:37.912 "seek_hole": false, 00:18:37.912 "seek_data": false, 00:18:37.912 "copy": true, 00:18:37.912 "nvme_iov_md": false 00:18:37.912 }, 00:18:37.912 "memory_domains": [ 00:18:37.912 { 00:18:37.912 "dma_device_id": "system", 00:18:37.912 "dma_device_type": 1 00:18:37.912 }, 00:18:37.912 { 00:18:37.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.912 "dma_device_type": 2 00:18:37.912 } 00:18:37.912 ], 00:18:37.912 "driver_specific": {} 00:18:37.912 }' 00:18:37.912 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.912 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.912 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:37.912 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.171 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.171 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:38.171 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.171 09:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.171 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:38.171 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.171 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.171 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.171 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.171 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:38.171 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.429 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.429 "name": "BaseBdev2", 00:18:38.429 "aliases": [ 00:18:38.429 "96bc10df-b22c-4324-9457-35803887b734" 00:18:38.429 ], 00:18:38.429 "product_name": "Malloc disk", 00:18:38.429 "block_size": 512, 00:18:38.429 "num_blocks": 65536, 00:18:38.429 "uuid": "96bc10df-b22c-4324-9457-35803887b734", 00:18:38.429 "assigned_rate_limits": { 00:18:38.429 "rw_ios_per_sec": 0, 00:18:38.429 "rw_mbytes_per_sec": 0, 00:18:38.429 "r_mbytes_per_sec": 0, 00:18:38.429 "w_mbytes_per_sec": 0 00:18:38.429 }, 00:18:38.429 "claimed": true, 00:18:38.429 "claim_type": "exclusive_write", 00:18:38.429 "zoned": false, 00:18:38.429 "supported_io_types": { 00:18:38.429 "read": true, 00:18:38.429 "write": true, 00:18:38.429 "unmap": true, 00:18:38.429 "flush": true, 00:18:38.429 "reset": true, 00:18:38.429 "nvme_admin": false, 00:18:38.429 "nvme_io": false, 00:18:38.429 "nvme_io_md": false, 00:18:38.429 "write_zeroes": true, 00:18:38.429 "zcopy": true, 00:18:38.429 "get_zone_info": false, 00:18:38.429 "zone_management": false, 00:18:38.429 "zone_append": false, 00:18:38.429 "compare": false, 00:18:38.429 "compare_and_write": false, 00:18:38.429 "abort": true, 00:18:38.429 "seek_hole": false, 00:18:38.429 "seek_data": false, 00:18:38.429 "copy": true, 00:18:38.429 "nvme_iov_md": false 00:18:38.429 }, 00:18:38.429 "memory_domains": [ 00:18:38.429 { 00:18:38.429 "dma_device_id": "system", 00:18:38.429 "dma_device_type": 1 00:18:38.429 }, 00:18:38.429 { 00:18:38.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.429 "dma_device_type": 2 00:18:38.429 } 00:18:38.429 ], 00:18:38.429 "driver_specific": {} 00:18:38.429 }' 00:18:38.429 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.688 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.688 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:38.688 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.688 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.688 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:38.688 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.688 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.947 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:38.947 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.947 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.947 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.947 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.947 09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:38.947 
09:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:39.514 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:39.514 "name": "BaseBdev3", 00:18:39.514 "aliases": [ 00:18:39.514 "ae6965ba-40a1-422d-8dc0-d9bd64afa080" 00:18:39.514 ], 00:18:39.514 "product_name": "Malloc disk", 00:18:39.514 "block_size": 512, 00:18:39.514 "num_blocks": 65536, 00:18:39.514 "uuid": "ae6965ba-40a1-422d-8dc0-d9bd64afa080", 00:18:39.514 "assigned_rate_limits": { 00:18:39.514 "rw_ios_per_sec": 0, 00:18:39.514 "rw_mbytes_per_sec": 0, 00:18:39.514 "r_mbytes_per_sec": 0, 00:18:39.514 "w_mbytes_per_sec": 0 00:18:39.514 }, 00:18:39.514 "claimed": true, 00:18:39.514 "claim_type": "exclusive_write", 00:18:39.514 "zoned": false, 00:18:39.514 "supported_io_types": { 00:18:39.514 "read": true, 00:18:39.514 "write": true, 00:18:39.514 "unmap": true, 00:18:39.514 "flush": true, 00:18:39.514 "reset": true, 00:18:39.514 "nvme_admin": false, 00:18:39.514 "nvme_io": false, 00:18:39.514 "nvme_io_md": false, 00:18:39.514 "write_zeroes": true, 00:18:39.514 "zcopy": true, 00:18:39.514 "get_zone_info": false, 00:18:39.514 "zone_management": false, 00:18:39.514 "zone_append": false, 00:18:39.514 "compare": false, 00:18:39.514 "compare_and_write": false, 00:18:39.514 "abort": true, 00:18:39.514 "seek_hole": false, 00:18:39.514 "seek_data": false, 00:18:39.514 "copy": true, 00:18:39.514 "nvme_iov_md": false 00:18:39.514 }, 00:18:39.514 "memory_domains": [ 00:18:39.514 { 00:18:39.514 "dma_device_id": "system", 00:18:39.514 "dma_device_type": 1 00:18:39.514 }, 00:18:39.514 { 00:18:39.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.514 "dma_device_type": 2 00:18:39.514 } 00:18:39.514 ], 00:18:39.514 "driver_specific": {} 00:18:39.514 }' 00:18:39.514 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.514 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.514 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:39.514 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.514 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.514 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:39.514 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.773 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.773 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:39.773 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.773 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.773 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:39.773 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:39.773 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:39.773 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:40.031 09:22:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:40.031 "name": "BaseBdev4", 00:18:40.031 "aliases": [ 00:18:40.031 "8b275add-acf2-49b3-8e82-c8dc274ee577" 00:18:40.031 ], 00:18:40.031 "product_name": "Malloc disk", 00:18:40.031 "block_size": 512, 00:18:40.031 "num_blocks": 65536, 00:18:40.031 "uuid": "8b275add-acf2-49b3-8e82-c8dc274ee577", 00:18:40.031 "assigned_rate_limits": { 00:18:40.031 "rw_ios_per_sec": 0, 00:18:40.031 "rw_mbytes_per_sec": 0, 00:18:40.031 "r_mbytes_per_sec": 0, 00:18:40.031 "w_mbytes_per_sec": 0 00:18:40.031 }, 00:18:40.031 "claimed": true, 00:18:40.031 "claim_type": "exclusive_write", 00:18:40.031 "zoned": false, 00:18:40.031 "supported_io_types": { 00:18:40.031 "read": true, 00:18:40.031 "write": true, 00:18:40.031 "unmap": true, 00:18:40.031 "flush": true, 00:18:40.031 "reset": true, 00:18:40.031 "nvme_admin": false, 00:18:40.031 "nvme_io": false, 00:18:40.031 "nvme_io_md": false, 00:18:40.031 "write_zeroes": true, 00:18:40.031 "zcopy": true, 00:18:40.031 "get_zone_info": false, 00:18:40.031 "zone_management": false, 00:18:40.031 "zone_append": false, 00:18:40.031 "compare": false, 00:18:40.031 "compare_and_write": false, 00:18:40.031 "abort": true, 00:18:40.031 "seek_hole": false, 00:18:40.031 "seek_data": false, 00:18:40.031 "copy": true, 00:18:40.031 "nvme_iov_md": false 00:18:40.031 }, 00:18:40.031 "memory_domains": [ 00:18:40.031 { 00:18:40.031 "dma_device_id": "system", 00:18:40.031 "dma_device_type": 1 00:18:40.031 }, 00:18:40.031 { 00:18:40.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.031 "dma_device_type": 2 00:18:40.031 } 00:18:40.031 ], 00:18:40.031 "driver_specific": {} 00:18:40.031 }' 00:18:40.031 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:40.031 09:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:40.290 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:40.290 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:40.290 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:40.290 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:40.290 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:40.290 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:40.559 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:40.559 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:40.559 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:40.559 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:40.559 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:40.817 [2024-07-15 09:22:49.573912] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:40.817 [2024-07-15 09:22:49.573948] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:40.817 [2024-07-15 09:22:49.574003] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
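The per-bdev property checks traced above all have the same shape: dump one bdev with bdev_get_bdevs and compare a single jq field against the value expected for a 512-byte Malloc disk with no metadata. A condensed sketch of that loop is below; the loop structure is simplified for illustration, while the bdev names, fields and expected values are taken from the trace.

    # Condensed, illustrative version of the property checks traced above.
    sock=/var/tmp/spdk-raid.sock
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    for name in NewBaseBdev BaseBdev2 BaseBdev3 BaseBdev4; do
        info=$("$rpc" -s "$sock" bdev_get_bdevs -b "$name" | jq '.[]')
        [[ $(jq .block_size    <<<"$info") == 512  ]]   # Malloc disk block size
        [[ $(jq .md_size       <<<"$info") == null ]]   # no separate metadata
        [[ $(jq .md_interleave <<<"$info") == null ]]
        [[ $(jq .dif_type      <<<"$info") == null ]]
    done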
00:18:40.817 [2024-07-15 09:22:49.574068] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:40.817 [2024-07-15 09:22:49.574080] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfcf470 name Existed_Raid, state offline 00:18:40.817 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 150175 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 150175 ']' 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 150175 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 150175 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 150175' 00:18:40.818 killing process with pid 150175 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 150175 00:18:40.818 [2024-07-15 09:22:49.637262] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:40.818 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 150175 00:18:40.818 [2024-07-15 09:22:49.673348] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:41.075 09:22:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:41.075 00:18:41.075 real 0m33.008s 00:18:41.075 user 1m0.685s 00:18:41.075 sys 0m5.820s 00:18:41.075 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:41.075 09:22:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:41.075 ************************************ 00:18:41.075 END TEST raid_state_function_test_sb 00:18:41.075 ************************************ 00:18:41.075 09:22:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:41.075 09:22:49 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:18:41.075 09:22:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:41.075 09:22:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:41.075 09:22:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:41.075 ************************************ 00:18:41.075 START TEST raid_superblock_test 00:18:41.075 ************************************ 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local 
base_bdevs_malloc 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=155193 00:18:41.075 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 155193 /var/tmp/spdk-raid.sock 00:18:41.076 09:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:41.076 09:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 155193 ']' 00:18:41.076 09:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:41.076 09:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:41.076 09:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:41.076 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:41.076 09:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:41.076 09:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:41.333 [2024-07-15 09:22:50.030231] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
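The superblock test that starts here builds its array from passthru bdevs layered on malloc disks and then creates the raid0 volume with the -s flag, as the following trace shows. A condensed sketch of that setup is below; the sizes, names, UUIDs and flags are copied from the trace, while the loop itself is a simplification of the traced per-bdev steps.

    # Condensed sketch of the raid_superblock_test setup traced below
    # (loop structure simplified; values taken from the trace).
    sock=/var/tmp/spdk-raid.sock
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    base_bdevs_pt=()
    for i in 1 2 3 4; do
        "$rpc" -s "$sock" bdev_malloc_create 32 512 -b "malloc$i"
        "$rpc" -s "$sock" bdev_passthru_create -b "malloc$i" -p "pt$i" \
               -u "00000000-0000-0000-0000-00000000000$i"
        base_bdevs_pt+=("pt$i")
    done
    "$rpc" -s "$sock" bdev_raid_create -z 64 -r raid0 \
           -b "${base_bdevs_pt[*]}" -n raid_bdev1 -s   # -s: create with superblock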
00:18:41.333 [2024-07-15 09:22:50.030302] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155193 ] 00:18:41.334 [2024-07-15 09:22:50.150717] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:41.334 [2024-07-15 09:22:50.257438] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:41.592 [2024-07-15 09:22:50.319198] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:41.592 [2024-07-15 09:22:50.319228] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:42.158 09:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:42.416 malloc1 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:42.416 [2024-07-15 09:22:51.349260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:42.416 [2024-07-15 09:22:51.349311] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:42.416 [2024-07-15 09:22:51.349331] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c5570 00:18:42.416 [2024-07-15 09:22:51.349344] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:42.416 [2024-07-15 09:22:51.351110] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:42.416 [2024-07-15 09:22:51.351140] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:42.416 pt1 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:42.416 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:42.683 malloc2 00:18:42.683 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:42.978 [2024-07-15 09:22:51.787289] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:42.978 [2024-07-15 09:22:51.787333] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:42.978 [2024-07-15 09:22:51.787351] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c6970 00:18:42.978 [2024-07-15 09:22:51.787363] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:42.978 [2024-07-15 09:22:51.788888] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:42.978 [2024-07-15 09:22:51.788916] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:42.978 pt2 00:18:42.978 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:42.978 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:42.978 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:42.978 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:42.978 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:42.978 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:42.978 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:42.978 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:42.978 09:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:43.235 malloc3 00:18:43.235 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:43.493 [2024-07-15 09:22:52.266437] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:43.493 [2024-07-15 09:22:52.266485] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:43.493 [2024-07-15 09:22:52.266503] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x155d340 00:18:43.493 [2024-07-15 09:22:52.266516] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:43.493 [2024-07-15 09:22:52.268090] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:43.493 [2024-07-15 09:22:52.268118] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:43.493 pt3 00:18:43.493 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:43.493 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:43.493 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:43.493 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:43.493 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:43.493 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:43.493 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:43.493 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:43.493 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:43.750 malloc4 00:18:43.750 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:44.008 [2024-07-15 09:22:52.745703] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:44.008 [2024-07-15 09:22:52.745752] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:44.008 [2024-07-15 09:22:52.745773] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x155fc60 00:18:44.008 [2024-07-15 09:22:52.745786] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:44.008 [2024-07-15 09:22:52.747424] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:44.008 [2024-07-15 09:22:52.747453] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:44.008 pt4 00:18:44.008 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:44.008 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:44.008 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:44.266 [2024-07-15 09:22:52.978345] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:44.266 [2024-07-15 09:22:52.979735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:44.266 [2024-07-15 09:22:52.979790] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:44.266 [2024-07-15 09:22:52.979834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:44.266 [2024-07-15 09:22:52.980025] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13bd530 00:18:44.266 [2024-07-15 09:22:52.980038] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:44.266 [2024-07-15 09:22:52.980249] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13bb770 00:18:44.266 [2024-07-15 09:22:52.980400] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13bd530 00:18:44.266 [2024-07-15 09:22:52.980410] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13bd530 00:18:44.266 [2024-07-15 09:22:52.980510] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:44.266 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:44.266 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:44.266 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:44.267 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:44.267 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:44.267 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.267 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.267 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.267 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.267 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.267 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.267 09:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:44.523 09:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.523 "name": "raid_bdev1", 00:18:44.523 "uuid": "05301322-57a4-46a2-b77a-4b942bc36aa1", 00:18:44.523 "strip_size_kb": 64, 00:18:44.523 "state": "online", 00:18:44.523 "raid_level": "raid0", 00:18:44.523 "superblock": true, 00:18:44.523 "num_base_bdevs": 4, 00:18:44.523 "num_base_bdevs_discovered": 4, 00:18:44.523 "num_base_bdevs_operational": 4, 00:18:44.523 "base_bdevs_list": [ 00:18:44.523 { 00:18:44.523 "name": "pt1", 00:18:44.523 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:44.523 "is_configured": true, 00:18:44.523 "data_offset": 2048, 00:18:44.523 "data_size": 63488 00:18:44.523 }, 00:18:44.523 { 00:18:44.523 "name": "pt2", 00:18:44.523 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:44.523 "is_configured": true, 00:18:44.523 "data_offset": 2048, 00:18:44.523 "data_size": 63488 00:18:44.523 }, 00:18:44.523 { 00:18:44.523 "name": "pt3", 00:18:44.523 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:44.523 "is_configured": true, 00:18:44.523 "data_offset": 2048, 00:18:44.523 "data_size": 63488 00:18:44.523 }, 00:18:44.523 { 00:18:44.523 "name": "pt4", 00:18:44.523 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:44.523 "is_configured": true, 00:18:44.523 "data_offset": 2048, 00:18:44.523 "data_size": 63488 00:18:44.523 } 00:18:44.523 ] 00:18:44.523 }' 00:18:44.523 09:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.523 09:22:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.088 09:22:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:45.088 09:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:45.088 09:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:45.088 09:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:45.088 09:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:45.088 09:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:45.088 09:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:45.088 09:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:45.346 [2024-07-15 09:22:54.061480] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:45.346 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:45.346 "name": "raid_bdev1", 00:18:45.346 "aliases": [ 00:18:45.346 "05301322-57a4-46a2-b77a-4b942bc36aa1" 00:18:45.346 ], 00:18:45.346 "product_name": "Raid Volume", 00:18:45.346 "block_size": 512, 00:18:45.346 "num_blocks": 253952, 00:18:45.346 "uuid": "05301322-57a4-46a2-b77a-4b942bc36aa1", 00:18:45.347 "assigned_rate_limits": { 00:18:45.347 "rw_ios_per_sec": 0, 00:18:45.347 "rw_mbytes_per_sec": 0, 00:18:45.347 "r_mbytes_per_sec": 0, 00:18:45.347 "w_mbytes_per_sec": 0 00:18:45.347 }, 00:18:45.347 "claimed": false, 00:18:45.347 "zoned": false, 00:18:45.347 "supported_io_types": { 00:18:45.347 "read": true, 00:18:45.347 "write": true, 00:18:45.347 "unmap": true, 00:18:45.347 "flush": true, 00:18:45.347 "reset": true, 00:18:45.347 "nvme_admin": false, 00:18:45.347 "nvme_io": false, 00:18:45.347 "nvme_io_md": false, 00:18:45.347 "write_zeroes": true, 00:18:45.347 "zcopy": false, 00:18:45.347 "get_zone_info": false, 00:18:45.347 "zone_management": false, 00:18:45.347 "zone_append": false, 00:18:45.347 "compare": false, 00:18:45.347 "compare_and_write": false, 00:18:45.347 "abort": false, 00:18:45.347 "seek_hole": false, 00:18:45.347 "seek_data": false, 00:18:45.347 "copy": false, 00:18:45.347 "nvme_iov_md": false 00:18:45.347 }, 00:18:45.347 "memory_domains": [ 00:18:45.347 { 00:18:45.347 "dma_device_id": "system", 00:18:45.347 "dma_device_type": 1 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.347 "dma_device_type": 2 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "dma_device_id": "system", 00:18:45.347 "dma_device_type": 1 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.347 "dma_device_type": 2 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "dma_device_id": "system", 00:18:45.347 "dma_device_type": 1 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.347 "dma_device_type": 2 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "dma_device_id": "system", 00:18:45.347 "dma_device_type": 1 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.347 "dma_device_type": 2 00:18:45.347 } 00:18:45.347 ], 00:18:45.347 "driver_specific": { 00:18:45.347 "raid": { 00:18:45.347 "uuid": "05301322-57a4-46a2-b77a-4b942bc36aa1", 00:18:45.347 "strip_size_kb": 64, 00:18:45.347 "state": "online", 00:18:45.347 "raid_level": "raid0", 00:18:45.347 "superblock": 
true, 00:18:45.347 "num_base_bdevs": 4, 00:18:45.347 "num_base_bdevs_discovered": 4, 00:18:45.347 "num_base_bdevs_operational": 4, 00:18:45.347 "base_bdevs_list": [ 00:18:45.347 { 00:18:45.347 "name": "pt1", 00:18:45.347 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:45.347 "is_configured": true, 00:18:45.347 "data_offset": 2048, 00:18:45.347 "data_size": 63488 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "name": "pt2", 00:18:45.347 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:45.347 "is_configured": true, 00:18:45.347 "data_offset": 2048, 00:18:45.347 "data_size": 63488 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "name": "pt3", 00:18:45.347 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:45.347 "is_configured": true, 00:18:45.347 "data_offset": 2048, 00:18:45.347 "data_size": 63488 00:18:45.347 }, 00:18:45.347 { 00:18:45.347 "name": "pt4", 00:18:45.347 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:45.347 "is_configured": true, 00:18:45.347 "data_offset": 2048, 00:18:45.347 "data_size": 63488 00:18:45.347 } 00:18:45.347 ] 00:18:45.347 } 00:18:45.347 } 00:18:45.347 }' 00:18:45.347 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:45.347 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:45.347 pt2 00:18:45.347 pt3 00:18:45.347 pt4' 00:18:45.347 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:45.347 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:45.347 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:45.605 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:45.605 "name": "pt1", 00:18:45.605 "aliases": [ 00:18:45.605 "00000000-0000-0000-0000-000000000001" 00:18:45.605 ], 00:18:45.605 "product_name": "passthru", 00:18:45.605 "block_size": 512, 00:18:45.605 "num_blocks": 65536, 00:18:45.605 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:45.605 "assigned_rate_limits": { 00:18:45.605 "rw_ios_per_sec": 0, 00:18:45.605 "rw_mbytes_per_sec": 0, 00:18:45.605 "r_mbytes_per_sec": 0, 00:18:45.605 "w_mbytes_per_sec": 0 00:18:45.605 }, 00:18:45.605 "claimed": true, 00:18:45.605 "claim_type": "exclusive_write", 00:18:45.606 "zoned": false, 00:18:45.606 "supported_io_types": { 00:18:45.606 "read": true, 00:18:45.606 "write": true, 00:18:45.606 "unmap": true, 00:18:45.606 "flush": true, 00:18:45.606 "reset": true, 00:18:45.606 "nvme_admin": false, 00:18:45.606 "nvme_io": false, 00:18:45.606 "nvme_io_md": false, 00:18:45.606 "write_zeroes": true, 00:18:45.606 "zcopy": true, 00:18:45.606 "get_zone_info": false, 00:18:45.606 "zone_management": false, 00:18:45.606 "zone_append": false, 00:18:45.606 "compare": false, 00:18:45.606 "compare_and_write": false, 00:18:45.606 "abort": true, 00:18:45.606 "seek_hole": false, 00:18:45.606 "seek_data": false, 00:18:45.606 "copy": true, 00:18:45.606 "nvme_iov_md": false 00:18:45.606 }, 00:18:45.606 "memory_domains": [ 00:18:45.606 { 00:18:45.606 "dma_device_id": "system", 00:18:45.606 "dma_device_type": 1 00:18:45.606 }, 00:18:45.606 { 00:18:45.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.606 "dma_device_type": 2 00:18:45.606 } 00:18:45.606 ], 00:18:45.606 "driver_specific": { 00:18:45.606 "passthru": 
{ 00:18:45.606 "name": "pt1", 00:18:45.606 "base_bdev_name": "malloc1" 00:18:45.606 } 00:18:45.606 } 00:18:45.606 }' 00:18:45.606 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.606 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.606 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:45.606 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.606 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.606 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:45.864 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.864 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.864 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:45.864 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.864 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.864 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:45.864 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:45.864 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:45.864 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:46.122 09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.122 "name": "pt2", 00:18:46.122 "aliases": [ 00:18:46.122 "00000000-0000-0000-0000-000000000002" 00:18:46.122 ], 00:18:46.122 "product_name": "passthru", 00:18:46.122 "block_size": 512, 00:18:46.122 "num_blocks": 65536, 00:18:46.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:46.122 "assigned_rate_limits": { 00:18:46.123 "rw_ios_per_sec": 0, 00:18:46.123 "rw_mbytes_per_sec": 0, 00:18:46.123 "r_mbytes_per_sec": 0, 00:18:46.123 "w_mbytes_per_sec": 0 00:18:46.123 }, 00:18:46.123 "claimed": true, 00:18:46.123 "claim_type": "exclusive_write", 00:18:46.123 "zoned": false, 00:18:46.123 "supported_io_types": { 00:18:46.123 "read": true, 00:18:46.123 "write": true, 00:18:46.123 "unmap": true, 00:18:46.123 "flush": true, 00:18:46.123 "reset": true, 00:18:46.123 "nvme_admin": false, 00:18:46.123 "nvme_io": false, 00:18:46.123 "nvme_io_md": false, 00:18:46.123 "write_zeroes": true, 00:18:46.123 "zcopy": true, 00:18:46.123 "get_zone_info": false, 00:18:46.123 "zone_management": false, 00:18:46.123 "zone_append": false, 00:18:46.123 "compare": false, 00:18:46.123 "compare_and_write": false, 00:18:46.123 "abort": true, 00:18:46.123 "seek_hole": false, 00:18:46.123 "seek_data": false, 00:18:46.123 "copy": true, 00:18:46.123 "nvme_iov_md": false 00:18:46.123 }, 00:18:46.123 "memory_domains": [ 00:18:46.123 { 00:18:46.123 "dma_device_id": "system", 00:18:46.123 "dma_device_type": 1 00:18:46.123 }, 00:18:46.123 { 00:18:46.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.123 "dma_device_type": 2 00:18:46.123 } 00:18:46.123 ], 00:18:46.123 "driver_specific": { 00:18:46.123 "passthru": { 00:18:46.123 "name": "pt2", 00:18:46.123 "base_bdev_name": "malloc2" 00:18:46.123 } 00:18:46.123 } 00:18:46.123 }' 00:18:46.123 
09:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.123 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.381 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.381 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.381 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.381 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.381 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.381 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.381 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:46.381 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.381 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.639 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:46.639 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:46.639 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:46.639 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:46.639 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.639 "name": "pt3", 00:18:46.639 "aliases": [ 00:18:46.639 "00000000-0000-0000-0000-000000000003" 00:18:46.639 ], 00:18:46.639 "product_name": "passthru", 00:18:46.639 "block_size": 512, 00:18:46.639 "num_blocks": 65536, 00:18:46.639 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:46.639 "assigned_rate_limits": { 00:18:46.639 "rw_ios_per_sec": 0, 00:18:46.639 "rw_mbytes_per_sec": 0, 00:18:46.639 "r_mbytes_per_sec": 0, 00:18:46.639 "w_mbytes_per_sec": 0 00:18:46.639 }, 00:18:46.639 "claimed": true, 00:18:46.639 "claim_type": "exclusive_write", 00:18:46.639 "zoned": false, 00:18:46.639 "supported_io_types": { 00:18:46.639 "read": true, 00:18:46.639 "write": true, 00:18:46.639 "unmap": true, 00:18:46.639 "flush": true, 00:18:46.639 "reset": true, 00:18:46.639 "nvme_admin": false, 00:18:46.639 "nvme_io": false, 00:18:46.639 "nvme_io_md": false, 00:18:46.639 "write_zeroes": true, 00:18:46.639 "zcopy": true, 00:18:46.639 "get_zone_info": false, 00:18:46.639 "zone_management": false, 00:18:46.639 "zone_append": false, 00:18:46.639 "compare": false, 00:18:46.639 "compare_and_write": false, 00:18:46.639 "abort": true, 00:18:46.639 "seek_hole": false, 00:18:46.639 "seek_data": false, 00:18:46.639 "copy": true, 00:18:46.639 "nvme_iov_md": false 00:18:46.639 }, 00:18:46.639 "memory_domains": [ 00:18:46.639 { 00:18:46.639 "dma_device_id": "system", 00:18:46.639 "dma_device_type": 1 00:18:46.639 }, 00:18:46.639 { 00:18:46.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.639 "dma_device_type": 2 00:18:46.639 } 00:18:46.639 ], 00:18:46.639 "driver_specific": { 00:18:46.639 "passthru": { 00:18:46.639 "name": "pt3", 00:18:46.639 "base_bdev_name": "malloc3" 00:18:46.639 } 00:18:46.639 } 00:18:46.639 }' 00:18:46.639 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.898 09:22:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.898 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.898 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.898 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.898 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.898 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.898 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.898 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:46.898 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.157 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.157 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:47.157 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:47.157 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:47.157 09:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:47.416 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:47.416 "name": "pt4", 00:18:47.416 "aliases": [ 00:18:47.416 "00000000-0000-0000-0000-000000000004" 00:18:47.416 ], 00:18:47.416 "product_name": "passthru", 00:18:47.416 "block_size": 512, 00:18:47.416 "num_blocks": 65536, 00:18:47.416 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:47.416 "assigned_rate_limits": { 00:18:47.416 "rw_ios_per_sec": 0, 00:18:47.416 "rw_mbytes_per_sec": 0, 00:18:47.416 "r_mbytes_per_sec": 0, 00:18:47.416 "w_mbytes_per_sec": 0 00:18:47.416 }, 00:18:47.416 "claimed": true, 00:18:47.416 "claim_type": "exclusive_write", 00:18:47.416 "zoned": false, 00:18:47.416 "supported_io_types": { 00:18:47.416 "read": true, 00:18:47.416 "write": true, 00:18:47.416 "unmap": true, 00:18:47.416 "flush": true, 00:18:47.416 "reset": true, 00:18:47.416 "nvme_admin": false, 00:18:47.416 "nvme_io": false, 00:18:47.416 "nvme_io_md": false, 00:18:47.416 "write_zeroes": true, 00:18:47.416 "zcopy": true, 00:18:47.416 "get_zone_info": false, 00:18:47.416 "zone_management": false, 00:18:47.416 "zone_append": false, 00:18:47.416 "compare": false, 00:18:47.416 "compare_and_write": false, 00:18:47.416 "abort": true, 00:18:47.416 "seek_hole": false, 00:18:47.416 "seek_data": false, 00:18:47.416 "copy": true, 00:18:47.416 "nvme_iov_md": false 00:18:47.416 }, 00:18:47.416 "memory_domains": [ 00:18:47.416 { 00:18:47.416 "dma_device_id": "system", 00:18:47.416 "dma_device_type": 1 00:18:47.416 }, 00:18:47.416 { 00:18:47.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.416 "dma_device_type": 2 00:18:47.416 } 00:18:47.416 ], 00:18:47.416 "driver_specific": { 00:18:47.416 "passthru": { 00:18:47.416 "name": "pt4", 00:18:47.416 "base_bdev_name": "malloc4" 00:18:47.416 } 00:18:47.416 } 00:18:47.416 }' 00:18:47.416 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.416 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.416 09:22:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:47.416 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.416 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.416 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:47.416 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.416 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.675 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:47.675 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.675 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.675 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:47.675 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:47.675 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:47.934 [2024-07-15 09:22:56.708515] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:47.934 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=05301322-57a4-46a2-b77a-4b942bc36aa1 00:18:47.934 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 05301322-57a4-46a2-b77a-4b942bc36aa1 ']' 00:18:47.934 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:48.192 [2024-07-15 09:22:56.956866] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:48.192 [2024-07-15 09:22:56.956888] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:48.192 [2024-07-15 09:22:56.956946] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:48.192 [2024-07-15 09:22:56.957010] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:48.192 [2024-07-15 09:22:56.957022] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13bd530 name raid_bdev1, state offline 00:18:48.192 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.192 09:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:48.451 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:48.451 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:48.451 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:48.451 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:48.451 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:48.451 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:48.710 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:48.710 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:48.968 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:48.968 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:49.227 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:49.227 09:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:49.485 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:49.743 [2024-07-15 09:22:58.480846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:49.743 [2024-07-15 09:22:58.482189] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:49.743 [2024-07-15 09:22:58.482232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
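The RPC sequence exercised by this raid_superblock_test run can be replayed by hand against a running SPDK target. The sketch below is pieced together only from the commands visible in this log; it assumes an SPDK application is already serving RPCs on /var/tmp/spdk-raid.sock and that SPDK_DIR points at the checked-out tree (both placeholders introduced here for illustration, not part of the test script itself).

  rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Four 32 MiB / 512 B-block malloc bdevs, each wrapped in a passthru bdev with a fixed UUID.
  for i in 1 2 3 4; do
      $rpc bdev_malloc_create 32 512 -b malloc$i
      $rpc bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
  done
  # Assemble a raid0 bdev with a 64 KiB strip; -s writes a superblock onto the base bdevs.
  $rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
  # Inspect it the same way the test does.
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
  $rpc bdev_get_bdevs -b raid_bdev1 | jq '.[]'
  # Tear down the raid and the passthru layer. The superblock written above remains on the
  # malloc bdevs, so re-creating the array directly on them is rejected with "File exists",
  # which is the failure path being checked in the surrounding log records.
  $rpc bdev_raid_delete raid_bdev1
  for i in 1 2 3 4; do $rpc bdev_passthru_delete pt$i; done
  $rpc bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 \
      || echo "creation refused: superblock of a different raid bdev already present"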
00:18:49.743 [2024-07-15 09:22:58.482265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:49.743 [2024-07-15 09:22:58.482310] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:49.743 [2024-07-15 09:22:58.482350] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:49.743 [2024-07-15 09:22:58.482373] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:49.743 [2024-07-15 09:22:58.482403] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:49.743 [2024-07-15 09:22:58.482421] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:49.743 [2024-07-15 09:22:58.482431] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1568ff0 name raid_bdev1, state configuring 00:18:49.743 request: 00:18:49.743 { 00:18:49.743 "name": "raid_bdev1", 00:18:49.743 "raid_level": "raid0", 00:18:49.743 "base_bdevs": [ 00:18:49.743 "malloc1", 00:18:49.743 "malloc2", 00:18:49.743 "malloc3", 00:18:49.743 "malloc4" 00:18:49.743 ], 00:18:49.743 "strip_size_kb": 64, 00:18:49.743 "superblock": false, 00:18:49.743 "method": "bdev_raid_create", 00:18:49.743 "req_id": 1 00:18:49.743 } 00:18:49.743 Got JSON-RPC error response 00:18:49.743 response: 00:18:49.743 { 00:18:49.743 "code": -17, 00:18:49.743 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:49.743 } 00:18:49.743 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:49.743 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:49.743 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:49.743 09:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:49.743 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.743 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:50.002 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:50.002 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:50.002 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:50.260 [2024-07-15 09:22:58.974080] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:50.260 [2024-07-15 09:22:58.974117] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:50.260 [2024-07-15 09:22:58.974137] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c57a0 00:18:50.260 [2024-07-15 09:22:58.974149] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:50.260 [2024-07-15 09:22:58.975628] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:50.260 [2024-07-15 09:22:58.975654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:50.260 [2024-07-15 
09:22:58.975710] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:50.260 [2024-07-15 09:22:58.975735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:50.260 pt1 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.260 09:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:50.519 09:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.519 "name": "raid_bdev1", 00:18:50.519 "uuid": "05301322-57a4-46a2-b77a-4b942bc36aa1", 00:18:50.519 "strip_size_kb": 64, 00:18:50.519 "state": "configuring", 00:18:50.519 "raid_level": "raid0", 00:18:50.519 "superblock": true, 00:18:50.519 "num_base_bdevs": 4, 00:18:50.519 "num_base_bdevs_discovered": 1, 00:18:50.519 "num_base_bdevs_operational": 4, 00:18:50.519 "base_bdevs_list": [ 00:18:50.519 { 00:18:50.519 "name": "pt1", 00:18:50.520 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:50.520 "is_configured": true, 00:18:50.520 "data_offset": 2048, 00:18:50.520 "data_size": 63488 00:18:50.520 }, 00:18:50.520 { 00:18:50.520 "name": null, 00:18:50.520 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:50.520 "is_configured": false, 00:18:50.520 "data_offset": 2048, 00:18:50.520 "data_size": 63488 00:18:50.520 }, 00:18:50.520 { 00:18:50.520 "name": null, 00:18:50.520 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:50.520 "is_configured": false, 00:18:50.520 "data_offset": 2048, 00:18:50.520 "data_size": 63488 00:18:50.520 }, 00:18:50.520 { 00:18:50.520 "name": null, 00:18:50.520 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:50.520 "is_configured": false, 00:18:50.520 "data_offset": 2048, 00:18:50.520 "data_size": 63488 00:18:50.520 } 00:18:50.520 ] 00:18:50.520 }' 00:18:50.520 09:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.520 09:22:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.087 09:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:51.087 09:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:51.346 [2024-07-15 09:23:00.077031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:51.346 [2024-07-15 09:23:00.077085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:51.346 [2024-07-15 09:23:00.077106] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x155e940 00:18:51.346 [2024-07-15 09:23:00.077119] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:51.346 [2024-07-15 09:23:00.077455] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:51.346 [2024-07-15 09:23:00.077473] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:51.346 [2024-07-15 09:23:00.077539] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:51.346 [2024-07-15 09:23:00.077558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:51.346 pt2 00:18:51.346 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:51.605 [2024-07-15 09:23:00.321693] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.605 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.864 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.864 "name": "raid_bdev1", 00:18:51.864 "uuid": "05301322-57a4-46a2-b77a-4b942bc36aa1", 00:18:51.864 "strip_size_kb": 64, 00:18:51.864 "state": "configuring", 00:18:51.864 "raid_level": "raid0", 00:18:51.864 "superblock": true, 00:18:51.864 "num_base_bdevs": 4, 00:18:51.864 "num_base_bdevs_discovered": 1, 00:18:51.864 "num_base_bdevs_operational": 4, 00:18:51.864 "base_bdevs_list": [ 00:18:51.864 { 00:18:51.864 "name": "pt1", 00:18:51.864 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:51.864 "is_configured": true, 00:18:51.864 "data_offset": 2048, 00:18:51.864 "data_size": 63488 00:18:51.864 }, 00:18:51.864 { 
00:18:51.864 "name": null, 00:18:51.864 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:51.864 "is_configured": false, 00:18:51.864 "data_offset": 2048, 00:18:51.864 "data_size": 63488 00:18:51.864 }, 00:18:51.864 { 00:18:51.864 "name": null, 00:18:51.864 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:51.864 "is_configured": false, 00:18:51.864 "data_offset": 2048, 00:18:51.864 "data_size": 63488 00:18:51.864 }, 00:18:51.864 { 00:18:51.864 "name": null, 00:18:51.864 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:51.864 "is_configured": false, 00:18:51.864 "data_offset": 2048, 00:18:51.864 "data_size": 63488 00:18:51.864 } 00:18:51.864 ] 00:18:51.864 }' 00:18:51.864 09:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.864 09:23:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:52.431 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:52.431 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:52.431 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:52.690 [2024-07-15 09:23:01.404547] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:52.690 [2024-07-15 09:23:01.404598] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.690 [2024-07-15 09:23:01.404616] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13bc060 00:18:52.690 [2024-07-15 09:23:01.404628] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.690 [2024-07-15 09:23:01.404967] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.690 [2024-07-15 09:23:01.404986] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:52.690 [2024-07-15 09:23:01.405050] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:52.690 [2024-07-15 09:23:01.405069] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:52.690 pt2 00:18:52.690 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:52.690 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:52.690 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:52.949 [2024-07-15 09:23:01.645187] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:52.949 [2024-07-15 09:23:01.645222] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.949 [2024-07-15 09:23:01.645241] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13be8d0 00:18:52.949 [2024-07-15 09:23:01.645253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.949 [2024-07-15 09:23:01.645532] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.949 [2024-07-15 09:23:01.645550] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:52.949 [2024-07-15 09:23:01.645601] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:52.949 [2024-07-15 09:23:01.645617] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:52.949 pt3 00:18:52.949 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:52.949 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:52.949 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:52.949 [2024-07-15 09:23:01.885821] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:52.949 [2024-07-15 09:23:01.885855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.949 [2024-07-15 09:23:01.885870] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13bfb80 00:18:52.949 [2024-07-15 09:23:01.885881] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.949 [2024-07-15 09:23:01.886148] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.949 [2024-07-15 09:23:01.886169] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:52.949 [2024-07-15 09:23:01.886222] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:52.949 [2024-07-15 09:23:01.886240] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:52.949 [2024-07-15 09:23:01.886349] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13bc780 00:18:52.949 [2024-07-15 09:23:01.886360] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:52.949 [2024-07-15 09:23:01.886523] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c1d70 00:18:52.949 [2024-07-15 09:23:01.886647] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13bc780 00:18:52.949 [2024-07-15 09:23:01.886657] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13bc780 00:18:52.949 [2024-07-15 09:23:01.886751] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:52.949 pt4 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.209 09:23:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:53.209 09:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.209 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.209 "name": "raid_bdev1", 00:18:53.209 "uuid": "05301322-57a4-46a2-b77a-4b942bc36aa1", 00:18:53.209 "strip_size_kb": 64, 00:18:53.209 "state": "online", 00:18:53.209 "raid_level": "raid0", 00:18:53.209 "superblock": true, 00:18:53.209 "num_base_bdevs": 4, 00:18:53.209 "num_base_bdevs_discovered": 4, 00:18:53.209 "num_base_bdevs_operational": 4, 00:18:53.209 "base_bdevs_list": [ 00:18:53.209 { 00:18:53.209 "name": "pt1", 00:18:53.209 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:53.209 "is_configured": true, 00:18:53.209 "data_offset": 2048, 00:18:53.209 "data_size": 63488 00:18:53.209 }, 00:18:53.209 { 00:18:53.209 "name": "pt2", 00:18:53.209 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:53.209 "is_configured": true, 00:18:53.209 "data_offset": 2048, 00:18:53.209 "data_size": 63488 00:18:53.209 }, 00:18:53.209 { 00:18:53.209 "name": "pt3", 00:18:53.209 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:53.209 "is_configured": true, 00:18:53.209 "data_offset": 2048, 00:18:53.209 "data_size": 63488 00:18:53.209 }, 00:18:53.209 { 00:18:53.209 "name": "pt4", 00:18:53.209 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:53.209 "is_configured": true, 00:18:53.209 "data_offset": 2048, 00:18:53.209 "data_size": 63488 00:18:53.209 } 00:18:53.209 ] 00:18:53.209 }' 00:18:53.209 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.209 09:23:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:53.777 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:53.777 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:53.778 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:54.036 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:54.036 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:54.036 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:54.036 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:54.036 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:54.036 [2024-07-15 09:23:02.956983] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:54.036 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:54.036 "name": "raid_bdev1", 00:18:54.036 "aliases": [ 00:18:54.036 "05301322-57a4-46a2-b77a-4b942bc36aa1" 00:18:54.036 ], 00:18:54.036 "product_name": "Raid Volume", 00:18:54.036 "block_size": 512, 00:18:54.036 "num_blocks": 253952, 00:18:54.036 "uuid": 
"05301322-57a4-46a2-b77a-4b942bc36aa1", 00:18:54.036 "assigned_rate_limits": { 00:18:54.036 "rw_ios_per_sec": 0, 00:18:54.036 "rw_mbytes_per_sec": 0, 00:18:54.036 "r_mbytes_per_sec": 0, 00:18:54.036 "w_mbytes_per_sec": 0 00:18:54.036 }, 00:18:54.036 "claimed": false, 00:18:54.036 "zoned": false, 00:18:54.036 "supported_io_types": { 00:18:54.036 "read": true, 00:18:54.036 "write": true, 00:18:54.036 "unmap": true, 00:18:54.036 "flush": true, 00:18:54.036 "reset": true, 00:18:54.036 "nvme_admin": false, 00:18:54.036 "nvme_io": false, 00:18:54.036 "nvme_io_md": false, 00:18:54.036 "write_zeroes": true, 00:18:54.036 "zcopy": false, 00:18:54.036 "get_zone_info": false, 00:18:54.036 "zone_management": false, 00:18:54.036 "zone_append": false, 00:18:54.036 "compare": false, 00:18:54.036 "compare_and_write": false, 00:18:54.036 "abort": false, 00:18:54.036 "seek_hole": false, 00:18:54.036 "seek_data": false, 00:18:54.036 "copy": false, 00:18:54.036 "nvme_iov_md": false 00:18:54.036 }, 00:18:54.036 "memory_domains": [ 00:18:54.036 { 00:18:54.036 "dma_device_id": "system", 00:18:54.036 "dma_device_type": 1 00:18:54.036 }, 00:18:54.036 { 00:18:54.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.036 "dma_device_type": 2 00:18:54.036 }, 00:18:54.036 { 00:18:54.036 "dma_device_id": "system", 00:18:54.036 "dma_device_type": 1 00:18:54.036 }, 00:18:54.036 { 00:18:54.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.036 "dma_device_type": 2 00:18:54.036 }, 00:18:54.036 { 00:18:54.037 "dma_device_id": "system", 00:18:54.037 "dma_device_type": 1 00:18:54.037 }, 00:18:54.037 { 00:18:54.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.037 "dma_device_type": 2 00:18:54.037 }, 00:18:54.037 { 00:18:54.037 "dma_device_id": "system", 00:18:54.037 "dma_device_type": 1 00:18:54.037 }, 00:18:54.037 { 00:18:54.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.037 "dma_device_type": 2 00:18:54.037 } 00:18:54.037 ], 00:18:54.037 "driver_specific": { 00:18:54.037 "raid": { 00:18:54.037 "uuid": "05301322-57a4-46a2-b77a-4b942bc36aa1", 00:18:54.037 "strip_size_kb": 64, 00:18:54.037 "state": "online", 00:18:54.037 "raid_level": "raid0", 00:18:54.037 "superblock": true, 00:18:54.037 "num_base_bdevs": 4, 00:18:54.037 "num_base_bdevs_discovered": 4, 00:18:54.037 "num_base_bdevs_operational": 4, 00:18:54.037 "base_bdevs_list": [ 00:18:54.037 { 00:18:54.037 "name": "pt1", 00:18:54.037 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:54.037 "is_configured": true, 00:18:54.037 "data_offset": 2048, 00:18:54.037 "data_size": 63488 00:18:54.037 }, 00:18:54.037 { 00:18:54.037 "name": "pt2", 00:18:54.037 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:54.037 "is_configured": true, 00:18:54.037 "data_offset": 2048, 00:18:54.037 "data_size": 63488 00:18:54.037 }, 00:18:54.037 { 00:18:54.037 "name": "pt3", 00:18:54.037 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:54.037 "is_configured": true, 00:18:54.037 "data_offset": 2048, 00:18:54.037 "data_size": 63488 00:18:54.037 }, 00:18:54.037 { 00:18:54.037 "name": "pt4", 00:18:54.037 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:54.037 "is_configured": true, 00:18:54.037 "data_offset": 2048, 00:18:54.037 "data_size": 63488 00:18:54.037 } 00:18:54.037 ] 00:18:54.037 } 00:18:54.037 } 00:18:54.037 }' 00:18:54.037 09:23:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:54.295 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:18:54.295 pt2 00:18:54.295 pt3 00:18:54.295 pt4' 00:18:54.295 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:54.295 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:54.295 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.553 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.553 "name": "pt1", 00:18:54.553 "aliases": [ 00:18:54.553 "00000000-0000-0000-0000-000000000001" 00:18:54.553 ], 00:18:54.553 "product_name": "passthru", 00:18:54.553 "block_size": 512, 00:18:54.553 "num_blocks": 65536, 00:18:54.553 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:54.553 "assigned_rate_limits": { 00:18:54.553 "rw_ios_per_sec": 0, 00:18:54.553 "rw_mbytes_per_sec": 0, 00:18:54.553 "r_mbytes_per_sec": 0, 00:18:54.553 "w_mbytes_per_sec": 0 00:18:54.553 }, 00:18:54.553 "claimed": true, 00:18:54.553 "claim_type": "exclusive_write", 00:18:54.553 "zoned": false, 00:18:54.553 "supported_io_types": { 00:18:54.553 "read": true, 00:18:54.553 "write": true, 00:18:54.553 "unmap": true, 00:18:54.553 "flush": true, 00:18:54.553 "reset": true, 00:18:54.553 "nvme_admin": false, 00:18:54.553 "nvme_io": false, 00:18:54.553 "nvme_io_md": false, 00:18:54.553 "write_zeroes": true, 00:18:54.553 "zcopy": true, 00:18:54.553 "get_zone_info": false, 00:18:54.553 "zone_management": false, 00:18:54.553 "zone_append": false, 00:18:54.553 "compare": false, 00:18:54.553 "compare_and_write": false, 00:18:54.553 "abort": true, 00:18:54.553 "seek_hole": false, 00:18:54.553 "seek_data": false, 00:18:54.553 "copy": true, 00:18:54.553 "nvme_iov_md": false 00:18:54.553 }, 00:18:54.553 "memory_domains": [ 00:18:54.553 { 00:18:54.553 "dma_device_id": "system", 00:18:54.553 "dma_device_type": 1 00:18:54.553 }, 00:18:54.553 { 00:18:54.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.553 "dma_device_type": 2 00:18:54.553 } 00:18:54.553 ], 00:18:54.553 "driver_specific": { 00:18:54.553 "passthru": { 00:18:54.553 "name": "pt1", 00:18:54.553 "base_bdev_name": "malloc1" 00:18:54.553 } 00:18:54.553 } 00:18:54.553 }' 00:18:54.553 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.553 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.553 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.553 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.553 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.553 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.553 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.553 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.811 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.811 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.811 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.811 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:54.811 09:23:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:54.811 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:54.811 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.069 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.069 "name": "pt2", 00:18:55.069 "aliases": [ 00:18:55.069 "00000000-0000-0000-0000-000000000002" 00:18:55.069 ], 00:18:55.069 "product_name": "passthru", 00:18:55.069 "block_size": 512, 00:18:55.069 "num_blocks": 65536, 00:18:55.069 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:55.069 "assigned_rate_limits": { 00:18:55.069 "rw_ios_per_sec": 0, 00:18:55.069 "rw_mbytes_per_sec": 0, 00:18:55.069 "r_mbytes_per_sec": 0, 00:18:55.069 "w_mbytes_per_sec": 0 00:18:55.069 }, 00:18:55.069 "claimed": true, 00:18:55.069 "claim_type": "exclusive_write", 00:18:55.069 "zoned": false, 00:18:55.069 "supported_io_types": { 00:18:55.069 "read": true, 00:18:55.069 "write": true, 00:18:55.069 "unmap": true, 00:18:55.069 "flush": true, 00:18:55.069 "reset": true, 00:18:55.069 "nvme_admin": false, 00:18:55.069 "nvme_io": false, 00:18:55.069 "nvme_io_md": false, 00:18:55.069 "write_zeroes": true, 00:18:55.069 "zcopy": true, 00:18:55.069 "get_zone_info": false, 00:18:55.069 "zone_management": false, 00:18:55.069 "zone_append": false, 00:18:55.069 "compare": false, 00:18:55.069 "compare_and_write": false, 00:18:55.069 "abort": true, 00:18:55.069 "seek_hole": false, 00:18:55.069 "seek_data": false, 00:18:55.069 "copy": true, 00:18:55.069 "nvme_iov_md": false 00:18:55.069 }, 00:18:55.069 "memory_domains": [ 00:18:55.069 { 00:18:55.069 "dma_device_id": "system", 00:18:55.069 "dma_device_type": 1 00:18:55.069 }, 00:18:55.069 { 00:18:55.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.069 "dma_device_type": 2 00:18:55.069 } 00:18:55.069 ], 00:18:55.069 "driver_specific": { 00:18:55.069 "passthru": { 00:18:55.069 "name": "pt2", 00:18:55.069 "base_bdev_name": "malloc2" 00:18:55.069 } 00:18:55.069 } 00:18:55.069 }' 00:18:55.069 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.069 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.069 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.069 09:23:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.069 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:55.328 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.586 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.586 "name": "pt3", 00:18:55.586 "aliases": [ 00:18:55.586 "00000000-0000-0000-0000-000000000003" 00:18:55.586 ], 00:18:55.586 "product_name": "passthru", 00:18:55.586 "block_size": 512, 00:18:55.586 "num_blocks": 65536, 00:18:55.586 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:55.586 "assigned_rate_limits": { 00:18:55.586 "rw_ios_per_sec": 0, 00:18:55.586 "rw_mbytes_per_sec": 0, 00:18:55.586 "r_mbytes_per_sec": 0, 00:18:55.586 "w_mbytes_per_sec": 0 00:18:55.586 }, 00:18:55.586 "claimed": true, 00:18:55.586 "claim_type": "exclusive_write", 00:18:55.586 "zoned": false, 00:18:55.586 "supported_io_types": { 00:18:55.586 "read": true, 00:18:55.586 "write": true, 00:18:55.586 "unmap": true, 00:18:55.586 "flush": true, 00:18:55.586 "reset": true, 00:18:55.586 "nvme_admin": false, 00:18:55.586 "nvme_io": false, 00:18:55.586 "nvme_io_md": false, 00:18:55.586 "write_zeroes": true, 00:18:55.586 "zcopy": true, 00:18:55.586 "get_zone_info": false, 00:18:55.586 "zone_management": false, 00:18:55.586 "zone_append": false, 00:18:55.586 "compare": false, 00:18:55.586 "compare_and_write": false, 00:18:55.586 "abort": true, 00:18:55.586 "seek_hole": false, 00:18:55.586 "seek_data": false, 00:18:55.586 "copy": true, 00:18:55.586 "nvme_iov_md": false 00:18:55.586 }, 00:18:55.586 "memory_domains": [ 00:18:55.586 { 00:18:55.586 "dma_device_id": "system", 00:18:55.586 "dma_device_type": 1 00:18:55.586 }, 00:18:55.586 { 00:18:55.586 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.586 "dma_device_type": 2 00:18:55.586 } 00:18:55.586 ], 00:18:55.586 "driver_specific": { 00:18:55.586 "passthru": { 00:18:55.586 "name": "pt3", 00:18:55.586 "base_bdev_name": "malloc3" 00:18:55.586 } 00:18:55.586 } 00:18:55.586 }' 00:18:55.586 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.586 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.845 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.845 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.845 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.845 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.845 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.845 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.845 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.845 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.845 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.104 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:56.104 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:56.104 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:56.104 09:23:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:56.104 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:56.104 "name": "pt4", 00:18:56.104 "aliases": [ 00:18:56.104 "00000000-0000-0000-0000-000000000004" 00:18:56.104 ], 00:18:56.104 "product_name": "passthru", 00:18:56.104 "block_size": 512, 00:18:56.104 "num_blocks": 65536, 00:18:56.104 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:56.104 "assigned_rate_limits": { 00:18:56.104 "rw_ios_per_sec": 0, 00:18:56.104 "rw_mbytes_per_sec": 0, 00:18:56.104 "r_mbytes_per_sec": 0, 00:18:56.104 "w_mbytes_per_sec": 0 00:18:56.104 }, 00:18:56.104 "claimed": true, 00:18:56.104 "claim_type": "exclusive_write", 00:18:56.104 "zoned": false, 00:18:56.104 "supported_io_types": { 00:18:56.104 "read": true, 00:18:56.104 "write": true, 00:18:56.104 "unmap": true, 00:18:56.104 "flush": true, 00:18:56.104 "reset": true, 00:18:56.104 "nvme_admin": false, 00:18:56.104 "nvme_io": false, 00:18:56.104 "nvme_io_md": false, 00:18:56.104 "write_zeroes": true, 00:18:56.104 "zcopy": true, 00:18:56.104 "get_zone_info": false, 00:18:56.104 "zone_management": false, 00:18:56.104 "zone_append": false, 00:18:56.104 "compare": false, 00:18:56.104 "compare_and_write": false, 00:18:56.104 "abort": true, 00:18:56.104 "seek_hole": false, 00:18:56.104 "seek_data": false, 00:18:56.104 "copy": true, 00:18:56.104 "nvme_iov_md": false 00:18:56.104 }, 00:18:56.104 "memory_domains": [ 00:18:56.104 { 00:18:56.104 "dma_device_id": "system", 00:18:56.104 "dma_device_type": 1 00:18:56.104 }, 00:18:56.104 { 00:18:56.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.104 "dma_device_type": 2 00:18:56.104 } 00:18:56.104 ], 00:18:56.104 "driver_specific": { 00:18:56.104 "passthru": { 00:18:56.104 "name": "pt4", 00:18:56.104 "base_bdev_name": "malloc4" 00:18:56.104 } 00:18:56.104 } 00:18:56.104 }' 00:18:56.104 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.363 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.363 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:56.363 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.363 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.363 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:56.363 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.363 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.363 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:56.363 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.695 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.695 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:56.695 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:56.695 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:56.695 [2024-07-15 09:23:05.608054] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: 
raid_bdev_dump_config_json 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 05301322-57a4-46a2-b77a-4b942bc36aa1 '!=' 05301322-57a4-46a2-b77a-4b942bc36aa1 ']' 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 155193 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 155193 ']' 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 155193 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 155193 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 155193' 00:18:56.972 killing process with pid 155193 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 155193 00:18:56.972 [2024-07-15 09:23:05.683360] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:56.972 [2024-07-15 09:23:05.683426] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:56.972 [2024-07-15 09:23:05.683493] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:56.972 [2024-07-15 09:23:05.683507] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13bc780 name raid_bdev1, state offline 00:18:56.972 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 155193 00:18:56.972 [2024-07-15 09:23:05.723717] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:57.232 09:23:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:57.232 00:18:57.232 real 0m15.984s 00:18:57.232 user 0m28.765s 00:18:57.232 sys 0m2.909s 00:18:57.232 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:57.232 09:23:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:57.232 ************************************ 00:18:57.232 END TEST raid_superblock_test 00:18:57.232 ************************************ 00:18:57.232 09:23:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:57.232 09:23:05 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:18:57.232 09:23:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:57.232 09:23:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:57.232 09:23:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:57.232 ************************************ 00:18:57.232 START TEST raid_read_error_test 00:18:57.232 ************************************ 00:18:57.232 09:23:06 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.duxvOjEVYn 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=157533 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 157533 /var/tmp/spdk-raid.sock 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w 
randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 157533 ']' 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:57.232 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:57.232 09:23:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:57.232 [2024-07-15 09:23:06.116917] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:18:57.232 [2024-07-15 09:23:06.116989] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid157533 ] 00:18:57.491 [2024-07-15 09:23:06.247511] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:57.491 [2024-07-15 09:23:06.353240] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:57.491 [2024-07-15 09:23:06.425244] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:57.491 [2024-07-15 09:23:06.425300] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:58.429 09:23:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:58.429 09:23:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:58.429 09:23:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:58.429 09:23:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:58.429 BaseBdev1_malloc 00:18:58.429 09:23:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:58.687 true 00:18:58.687 09:23:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:58.946 [2024-07-15 09:23:07.772897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:58.946 [2024-07-15 09:23:07.772949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.946 [2024-07-15 09:23:07.772970] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19d10d0 00:18:58.946 [2024-07-15 09:23:07.772983] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.946 [2024-07-15 09:23:07.774861] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.946 [2024-07-15 09:23:07.774893] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:58.946 BaseBdev1 00:18:58.946 
09:23:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:58.946 09:23:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:59.205 BaseBdev2_malloc 00:18:59.205 09:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:59.464 true 00:18:59.464 09:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:59.723 [2024-07-15 09:23:08.508658] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:59.723 [2024-07-15 09:23:08.508704] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:59.723 [2024-07-15 09:23:08.508726] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19d5910 00:18:59.723 [2024-07-15 09:23:08.508739] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.723 [2024-07-15 09:23:08.510352] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:59.723 [2024-07-15 09:23:08.510384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:59.723 BaseBdev2 00:18:59.723 09:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:59.723 09:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:59.982 BaseBdev3_malloc 00:18:59.982 09:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:00.241 true 00:19:00.241 09:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:00.500 [2024-07-15 09:23:09.235127] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:00.500 [2024-07-15 09:23:09.235174] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:00.500 [2024-07-15 09:23:09.235195] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19d7bd0 00:19:00.500 [2024-07-15 09:23:09.235207] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:00.500 [2024-07-15 09:23:09.236781] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:00.500 [2024-07-15 09:23:09.236810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:00.500 BaseBdev3 00:19:00.500 09:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:00.500 09:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:00.759 BaseBdev4_malloc 00:19:00.759 09:23:09 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:01.018 true 00:19:01.018 09:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:01.018 [2024-07-15 09:23:09.963028] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:01.018 [2024-07-15 09:23:09.963072] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:01.018 [2024-07-15 09:23:09.963092] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19d8aa0 00:19:01.018 [2024-07-15 09:23:09.963105] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:01.018 [2024-07-15 09:23:09.964665] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:01.018 [2024-07-15 09:23:09.964693] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:01.018 BaseBdev4 00:19:01.278 09:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:01.278 [2024-07-15 09:23:10.207719] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:01.278 [2024-07-15 09:23:10.209161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:01.278 [2024-07-15 09:23:10.209238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:01.278 [2024-07-15 09:23:10.209300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:01.278 [2024-07-15 09:23:10.209532] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19d2c20 00:19:01.278 [2024-07-15 09:23:10.209544] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:01.278 [2024-07-15 09:23:10.209743] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1827260 00:19:01.278 [2024-07-15 09:23:10.209896] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19d2c20 00:19:01.278 [2024-07-15 09:23:10.209906] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19d2c20 00:19:01.278 [2024-07-15 09:23:10.210023] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:01.278 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:01.278 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:01.278 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:01.278 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:01.278 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:01.278 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:01.278 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.278 09:23:10 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.278 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.278 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.537 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.537 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:01.538 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.538 "name": "raid_bdev1", 00:19:01.538 "uuid": "9e02704c-22ac-459e-9ece-5199852175d7", 00:19:01.538 "strip_size_kb": 64, 00:19:01.538 "state": "online", 00:19:01.538 "raid_level": "raid0", 00:19:01.538 "superblock": true, 00:19:01.538 "num_base_bdevs": 4, 00:19:01.538 "num_base_bdevs_discovered": 4, 00:19:01.538 "num_base_bdevs_operational": 4, 00:19:01.538 "base_bdevs_list": [ 00:19:01.538 { 00:19:01.538 "name": "BaseBdev1", 00:19:01.538 "uuid": "cae87133-3d52-5250-8c4e-cbf19c9eaf8a", 00:19:01.538 "is_configured": true, 00:19:01.538 "data_offset": 2048, 00:19:01.538 "data_size": 63488 00:19:01.538 }, 00:19:01.538 { 00:19:01.538 "name": "BaseBdev2", 00:19:01.538 "uuid": "9c7c6acd-1508-5434-aa2b-7f6644a4e371", 00:19:01.538 "is_configured": true, 00:19:01.538 "data_offset": 2048, 00:19:01.538 "data_size": 63488 00:19:01.538 }, 00:19:01.538 { 00:19:01.538 "name": "BaseBdev3", 00:19:01.538 "uuid": "48382da4-bc41-5815-b937-5bc5b46cec83", 00:19:01.538 "is_configured": true, 00:19:01.538 "data_offset": 2048, 00:19:01.538 "data_size": 63488 00:19:01.538 }, 00:19:01.538 { 00:19:01.538 "name": "BaseBdev4", 00:19:01.538 "uuid": "8c8a6fc0-4627-5721-965d-8cb8c7176426", 00:19:01.538 "is_configured": true, 00:19:01.538 "data_offset": 2048, 00:19:01.538 "data_size": 63488 00:19:01.538 } 00:19:01.538 ] 00:19:01.538 }' 00:19:01.538 09:23:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.538 09:23:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:02.475 09:23:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:02.475 09:23:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:02.475 [2024-07-15 09:23:11.142623] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19c4fc0 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # 
local expected_state=online 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.413 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:03.672 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.672 "name": "raid_bdev1", 00:19:03.672 "uuid": "9e02704c-22ac-459e-9ece-5199852175d7", 00:19:03.672 "strip_size_kb": 64, 00:19:03.672 "state": "online", 00:19:03.672 "raid_level": "raid0", 00:19:03.672 "superblock": true, 00:19:03.672 "num_base_bdevs": 4, 00:19:03.672 "num_base_bdevs_discovered": 4, 00:19:03.672 "num_base_bdevs_operational": 4, 00:19:03.672 "base_bdevs_list": [ 00:19:03.672 { 00:19:03.672 "name": "BaseBdev1", 00:19:03.672 "uuid": "cae87133-3d52-5250-8c4e-cbf19c9eaf8a", 00:19:03.672 "is_configured": true, 00:19:03.672 "data_offset": 2048, 00:19:03.672 "data_size": 63488 00:19:03.672 }, 00:19:03.672 { 00:19:03.672 "name": "BaseBdev2", 00:19:03.672 "uuid": "9c7c6acd-1508-5434-aa2b-7f6644a4e371", 00:19:03.672 "is_configured": true, 00:19:03.672 "data_offset": 2048, 00:19:03.672 "data_size": 63488 00:19:03.672 }, 00:19:03.672 { 00:19:03.672 "name": "BaseBdev3", 00:19:03.672 "uuid": "48382da4-bc41-5815-b937-5bc5b46cec83", 00:19:03.672 "is_configured": true, 00:19:03.672 "data_offset": 2048, 00:19:03.672 "data_size": 63488 00:19:03.672 }, 00:19:03.672 { 00:19:03.672 "name": "BaseBdev4", 00:19:03.672 "uuid": "8c8a6fc0-4627-5721-965d-8cb8c7176426", 00:19:03.672 "is_configured": true, 00:19:03.672 "data_offset": 2048, 00:19:03.672 "data_size": 63488 00:19:03.673 } 00:19:03.673 ] 00:19:03.673 }' 00:19:03.673 09:23:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.673 09:23:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.241 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:04.499 [2024-07-15 09:23:13.307231] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:04.499 [2024-07-15 09:23:13.307270] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:04.499 [2024-07-15 09:23:13.310426] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:04.499 [2024-07-15 09:23:13.310463] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:04.499 [2024-07-15 09:23:13.310505] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:19:04.499 [2024-07-15 09:23:13.310517] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19d2c20 name raid_bdev1, state offline 00:19:04.499 0 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 157533 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 157533 ']' 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 157533 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 157533 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 157533' 00:19:04.499 killing process with pid 157533 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 157533 00:19:04.499 [2024-07-15 09:23:13.376791] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:04.499 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 157533 00:19:04.499 [2024-07-15 09:23:13.412563] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:04.757 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.duxvOjEVYn 00:19:04.757 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:04.757 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:04.757 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:19:04.757 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:04.757 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:04.757 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:04.757 09:23:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:19:04.757 00:19:04.757 real 0m7.619s 00:19:04.757 user 0m12.082s 00:19:04.757 sys 0m1.390s 00:19:04.758 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:04.758 09:23:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.758 ************************************ 00:19:04.758 END TEST raid_read_error_test 00:19:04.758 ************************************ 00:19:04.758 09:23:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:04.758 09:23:13 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:19:04.758 09:23:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:04.758 09:23:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:04.758 09:23:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:05.017 ************************************ 00:19:05.017 START TEST raid_write_error_test 00:19:05.017 ************************************ 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.LjDIBluBfy 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=158646 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 158646 /var/tmp/spdk-raid.sock 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w 
randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 158646 ']' 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:05.017 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:05.017 09:23:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.017 [2024-07-15 09:23:13.826450] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:19:05.017 [2024-07-15 09:23:13.826523] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid158646 ] 00:19:05.017 [2024-07-15 09:23:13.954963] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:05.276 [2024-07-15 09:23:14.058088] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:05.276 [2024-07-15 09:23:14.116912] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:05.276 [2024-07-15 09:23:14.116972] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:05.844 09:23:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:05.844 09:23:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:05.844 09:23:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:05.844 09:23:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:06.103 BaseBdev1_malloc 00:19:06.103 09:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:06.361 true 00:19:06.361 09:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:06.620 [2024-07-15 09:23:15.333715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:06.620 [2024-07-15 09:23:15.333762] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:06.620 [2024-07-15 09:23:15.333781] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d7b0d0 00:19:06.620 [2024-07-15 09:23:15.333794] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:06.620 [2024-07-15 09:23:15.335479] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:06.620 [2024-07-15 09:23:15.335508] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:06.620 BaseBdev1 
00:19:06.620 09:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:06.620 09:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:06.620 BaseBdev2_malloc 00:19:06.620 09:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:06.879 true 00:19:06.879 09:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:07.138 [2024-07-15 09:23:15.936020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:07.138 [2024-07-15 09:23:15.936069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:07.138 [2024-07-15 09:23:15.936092] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d7f910 00:19:07.138 [2024-07-15 09:23:15.936105] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:07.138 [2024-07-15 09:23:15.937595] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:07.138 [2024-07-15 09:23:15.937624] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:07.138 BaseBdev2 00:19:07.138 09:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:07.138 09:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:07.397 BaseBdev3_malloc 00:19:07.397 09:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:07.655 true 00:19:07.655 09:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:07.655 [2024-07-15 09:23:16.534277] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:07.655 [2024-07-15 09:23:16.534322] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:07.655 [2024-07-15 09:23:16.534342] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d81bd0 00:19:07.655 [2024-07-15 09:23:16.534355] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:07.655 [2024-07-15 09:23:16.535732] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:07.655 [2024-07-15 09:23:16.535761] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:07.655 BaseBdev3 00:19:07.655 09:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:07.655 09:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:07.913 BaseBdev4_malloc 00:19:07.913 09:23:16 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:08.171 true 00:19:08.171 09:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:08.171 [2024-07-15 09:23:17.064303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:08.171 [2024-07-15 09:23:17.064348] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:08.171 [2024-07-15 09:23:17.064367] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d82aa0 00:19:08.171 [2024-07-15 09:23:17.064379] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:08.171 [2024-07-15 09:23:17.065761] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:08.171 [2024-07-15 09:23:17.065787] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:08.171 BaseBdev4 00:19:08.171 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:08.429 [2024-07-15 09:23:17.308998] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:08.429 [2024-07-15 09:23:17.310310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:08.429 [2024-07-15 09:23:17.310378] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:08.429 [2024-07-15 09:23:17.310439] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:08.429 [2024-07-15 09:23:17.310669] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d7cc20 00:19:08.429 [2024-07-15 09:23:17.310681] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:08.429 [2024-07-15 09:23:17.310880] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bd1260 00:19:08.429 [2024-07-15 09:23:17.311040] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d7cc20 00:19:08.429 [2024-07-15 09:23:17.311050] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d7cc20 00:19:08.429 [2024-07-15 09:23:17.311150] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.429 09:23:17 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.429 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:08.687 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.687 "name": "raid_bdev1", 00:19:08.687 "uuid": "769ec973-2bcb-4cb9-b76c-bde3c1f7595f", 00:19:08.687 "strip_size_kb": 64, 00:19:08.687 "state": "online", 00:19:08.687 "raid_level": "raid0", 00:19:08.687 "superblock": true, 00:19:08.687 "num_base_bdevs": 4, 00:19:08.687 "num_base_bdevs_discovered": 4, 00:19:08.687 "num_base_bdevs_operational": 4, 00:19:08.687 "base_bdevs_list": [ 00:19:08.687 { 00:19:08.687 "name": "BaseBdev1", 00:19:08.687 "uuid": "b2b6a429-9214-54fa-b6aa-1f21dcbbd8b8", 00:19:08.687 "is_configured": true, 00:19:08.687 "data_offset": 2048, 00:19:08.687 "data_size": 63488 00:19:08.687 }, 00:19:08.687 { 00:19:08.687 "name": "BaseBdev2", 00:19:08.687 "uuid": "8c0ef74d-a991-57fe-bebc-29c5fcaafa94", 00:19:08.687 "is_configured": true, 00:19:08.687 "data_offset": 2048, 00:19:08.687 "data_size": 63488 00:19:08.687 }, 00:19:08.687 { 00:19:08.687 "name": "BaseBdev3", 00:19:08.687 "uuid": "f45b2799-4650-50ff-8345-4d18d1059920", 00:19:08.687 "is_configured": true, 00:19:08.687 "data_offset": 2048, 00:19:08.687 "data_size": 63488 00:19:08.687 }, 00:19:08.687 { 00:19:08.687 "name": "BaseBdev4", 00:19:08.687 "uuid": "1c7648db-2d36-5f06-8c16-28577c3b8fd9", 00:19:08.687 "is_configured": true, 00:19:08.687 "data_offset": 2048, 00:19:08.687 "data_size": 63488 00:19:08.687 } 00:19:08.687 ] 00:19:08.687 }' 00:19:08.687 09:23:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.687 09:23:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.253 09:23:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:09.253 09:23:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:09.511 [2024-07-15 09:23:18.251760] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d6efc0 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.444 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:10.701 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:10.702 "name": "raid_bdev1", 00:19:10.702 "uuid": "769ec973-2bcb-4cb9-b76c-bde3c1f7595f", 00:19:10.702 "strip_size_kb": 64, 00:19:10.702 "state": "online", 00:19:10.702 "raid_level": "raid0", 00:19:10.702 "superblock": true, 00:19:10.702 "num_base_bdevs": 4, 00:19:10.702 "num_base_bdevs_discovered": 4, 00:19:10.702 "num_base_bdevs_operational": 4, 00:19:10.702 "base_bdevs_list": [ 00:19:10.702 { 00:19:10.702 "name": "BaseBdev1", 00:19:10.702 "uuid": "b2b6a429-9214-54fa-b6aa-1f21dcbbd8b8", 00:19:10.702 "is_configured": true, 00:19:10.702 "data_offset": 2048, 00:19:10.702 "data_size": 63488 00:19:10.702 }, 00:19:10.702 { 00:19:10.702 "name": "BaseBdev2", 00:19:10.702 "uuid": "8c0ef74d-a991-57fe-bebc-29c5fcaafa94", 00:19:10.702 "is_configured": true, 00:19:10.702 "data_offset": 2048, 00:19:10.702 "data_size": 63488 00:19:10.702 }, 00:19:10.702 { 00:19:10.702 "name": "BaseBdev3", 00:19:10.702 "uuid": "f45b2799-4650-50ff-8345-4d18d1059920", 00:19:10.702 "is_configured": true, 00:19:10.702 "data_offset": 2048, 00:19:10.702 "data_size": 63488 00:19:10.702 }, 00:19:10.702 { 00:19:10.702 "name": "BaseBdev4", 00:19:10.702 "uuid": "1c7648db-2d36-5f06-8c16-28577c3b8fd9", 00:19:10.702 "is_configured": true, 00:19:10.702 "data_offset": 2048, 00:19:10.702 "data_size": 63488 00:19:10.702 } 00:19:10.702 ] 00:19:10.702 }' 00:19:10.702 09:23:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:10.702 09:23:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.267 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:11.572 [2024-07-15 09:23:20.412182] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:11.572 [2024-07-15 09:23:20.412229] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:11.572 [2024-07-15 09:23:20.415402] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:11.572 [2024-07-15 09:23:20.415441] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:11.572 [2024-07-15 09:23:20.415481] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base 
bdevs is 0, going to free all in destruct 00:19:11.572 [2024-07-15 09:23:20.415493] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d7cc20 name raid_bdev1, state offline 00:19:11.572 0 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 158646 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 158646 ']' 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 158646 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 158646 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 158646' 00:19:11.572 killing process with pid 158646 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 158646 00:19:11.572 [2024-07-15 09:23:20.481902] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:11.572 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 158646 00:19:11.572 [2024-07-15 09:23:20.513561] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.LjDIBluBfy 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:19:11.838 00:19:11.838 real 0m7.011s 00:19:11.838 user 0m10.971s 00:19:11.838 sys 0m1.297s 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:11.838 09:23:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.838 ************************************ 00:19:11.838 END TEST raid_write_error_test 00:19:11.838 ************************************ 00:19:12.098 09:23:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:12.098 09:23:20 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:12.098 09:23:20 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:19:12.098 09:23:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:12.098 09:23:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:12.098 09:23:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:12.098 
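(A condensed sketch of the write-error flow the trace above walks through, limited to the same RPC calls that appear in it; the long workspace paths are shortened to the SPDK repo root and the bdevperf output file name is left generic, since both vary per run.)

    # wrap the malloc base in an error bdev, then a passthru claimed as BaseBdev4
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4
    # assemble the raid0 volume: 64k strip size (-z 64), superblock enabled (-s)
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
    # inject write failures on the first base bdev and drive I/O through bdevperf
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
    examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
    # fail_per_s is column 6 of the raid_bdev1 result line; raid0 has no redundancy,
    # so the test expects it to be non-zero (0.46 in this run)
    grep -v Job <bdevperf_output> | grep raid_bdev1 | awk '{print $6}'
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1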
************************************ 00:19:12.098 START TEST raid_state_function_test 00:19:12.098 ************************************ 00:19:12.098 09:23:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:19:12.098 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:12.098 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:12.098 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:12.098 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:12.098 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:12.098 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=159629 00:19:12.099 09:23:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 159629' 00:19:12.099 Process raid pid: 159629 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 159629 /var/tmp/spdk-raid.sock 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 159629 ']' 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:12.099 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:12.099 09:23:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.099 [2024-07-15 09:23:20.905430] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:19:12.099 [2024-07-15 09:23:20.905496] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:12.099 [2024-07-15 09:23:21.036253] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:12.359 [2024-07-15 09:23:21.141004] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:12.359 [2024-07-15 09:23:21.206894] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:12.359 [2024-07-15 09:23:21.206924] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:12.926 09:23:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:12.926 09:23:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:12.926 09:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:13.185 [2024-07-15 09:23:22.065499] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:13.185 [2024-07-15 09:23:22.065546] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:13.185 [2024-07-15 09:23:22.065557] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:13.185 [2024-07-15 09:23:22.065569] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:13.185 [2024-07-15 09:23:22.065578] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:13.185 [2024-07-15 09:23:22.065589] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:13.185 [2024-07-15 09:23:22.065598] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev4 00:19:13.185 [2024-07-15 09:23:22.065609] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.185 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:13.444 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:13.444 "name": "Existed_Raid", 00:19:13.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.444 "strip_size_kb": 64, 00:19:13.444 "state": "configuring", 00:19:13.444 "raid_level": "concat", 00:19:13.444 "superblock": false, 00:19:13.444 "num_base_bdevs": 4, 00:19:13.444 "num_base_bdevs_discovered": 0, 00:19:13.444 "num_base_bdevs_operational": 4, 00:19:13.444 "base_bdevs_list": [ 00:19:13.444 { 00:19:13.444 "name": "BaseBdev1", 00:19:13.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.444 "is_configured": false, 00:19:13.444 "data_offset": 0, 00:19:13.444 "data_size": 0 00:19:13.444 }, 00:19:13.444 { 00:19:13.444 "name": "BaseBdev2", 00:19:13.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.444 "is_configured": false, 00:19:13.444 "data_offset": 0, 00:19:13.444 "data_size": 0 00:19:13.444 }, 00:19:13.444 { 00:19:13.444 "name": "BaseBdev3", 00:19:13.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.444 "is_configured": false, 00:19:13.444 "data_offset": 0, 00:19:13.444 "data_size": 0 00:19:13.444 }, 00:19:13.444 { 00:19:13.444 "name": "BaseBdev4", 00:19:13.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.444 "is_configured": false, 00:19:13.444 "data_offset": 0, 00:19:13.444 "data_size": 0 00:19:13.444 } 00:19:13.444 ] 00:19:13.444 }' 00:19:13.444 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:13.444 09:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.012 09:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:14.271 [2024-07-15 09:23:23.068026] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:14.271 [2024-07-15 09:23:23.068056] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e0aa0 name Existed_Raid, state configuring 00:19:14.271 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:14.529 [2024-07-15 09:23:23.244515] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:14.529 [2024-07-15 09:23:23.244542] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:14.529 [2024-07-15 09:23:23.244552] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:14.529 [2024-07-15 09:23:23.244564] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:14.529 [2024-07-15 09:23:23.244573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:14.529 [2024-07-15 09:23:23.244585] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:14.529 [2024-07-15 09:23:23.244594] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:14.529 [2024-07-15 09:23:23.244605] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:14.529 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:14.529 [2024-07-15 09:23:23.430843] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:14.529 BaseBdev1 00:19:14.529 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:14.529 09:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:14.529 09:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:14.529 09:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:14.529 09:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:14.529 09:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:14.529 09:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:14.787 09:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:15.045 [ 00:19:15.045 { 00:19:15.045 "name": "BaseBdev1", 00:19:15.045 "aliases": [ 00:19:15.045 "8a61f8c7-0684-4b24-9e78-09bc54500aff" 00:19:15.045 ], 00:19:15.045 "product_name": "Malloc disk", 00:19:15.045 "block_size": 512, 00:19:15.045 "num_blocks": 65536, 00:19:15.045 "uuid": "8a61f8c7-0684-4b24-9e78-09bc54500aff", 00:19:15.045 "assigned_rate_limits": { 00:19:15.045 "rw_ios_per_sec": 0, 00:19:15.045 "rw_mbytes_per_sec": 0, 00:19:15.045 "r_mbytes_per_sec": 0, 00:19:15.045 "w_mbytes_per_sec": 0 00:19:15.045 }, 00:19:15.045 
"claimed": true, 00:19:15.045 "claim_type": "exclusive_write", 00:19:15.045 "zoned": false, 00:19:15.045 "supported_io_types": { 00:19:15.045 "read": true, 00:19:15.045 "write": true, 00:19:15.045 "unmap": true, 00:19:15.045 "flush": true, 00:19:15.045 "reset": true, 00:19:15.045 "nvme_admin": false, 00:19:15.045 "nvme_io": false, 00:19:15.045 "nvme_io_md": false, 00:19:15.045 "write_zeroes": true, 00:19:15.045 "zcopy": true, 00:19:15.045 "get_zone_info": false, 00:19:15.045 "zone_management": false, 00:19:15.045 "zone_append": false, 00:19:15.045 "compare": false, 00:19:15.045 "compare_and_write": false, 00:19:15.045 "abort": true, 00:19:15.045 "seek_hole": false, 00:19:15.045 "seek_data": false, 00:19:15.045 "copy": true, 00:19:15.045 "nvme_iov_md": false 00:19:15.045 }, 00:19:15.045 "memory_domains": [ 00:19:15.045 { 00:19:15.045 "dma_device_id": "system", 00:19:15.045 "dma_device_type": 1 00:19:15.045 }, 00:19:15.045 { 00:19:15.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.045 "dma_device_type": 2 00:19:15.045 } 00:19:15.045 ], 00:19:15.045 "driver_specific": {} 00:19:15.045 } 00:19:15.045 ] 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.045 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.046 09:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.304 09:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.304 "name": "Existed_Raid", 00:19:15.304 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.304 "strip_size_kb": 64, 00:19:15.304 "state": "configuring", 00:19:15.304 "raid_level": "concat", 00:19:15.304 "superblock": false, 00:19:15.304 "num_base_bdevs": 4, 00:19:15.304 "num_base_bdevs_discovered": 1, 00:19:15.304 "num_base_bdevs_operational": 4, 00:19:15.304 "base_bdevs_list": [ 00:19:15.304 { 00:19:15.304 "name": "BaseBdev1", 00:19:15.304 "uuid": "8a61f8c7-0684-4b24-9e78-09bc54500aff", 00:19:15.304 "is_configured": true, 00:19:15.304 "data_offset": 0, 00:19:15.304 "data_size": 65536 00:19:15.304 }, 00:19:15.304 { 00:19:15.304 "name": "BaseBdev2", 00:19:15.304 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:15.304 "is_configured": false, 00:19:15.304 "data_offset": 0, 00:19:15.305 "data_size": 0 00:19:15.305 }, 00:19:15.305 { 00:19:15.305 "name": "BaseBdev3", 00:19:15.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.305 "is_configured": false, 00:19:15.305 "data_offset": 0, 00:19:15.305 "data_size": 0 00:19:15.305 }, 00:19:15.305 { 00:19:15.305 "name": "BaseBdev4", 00:19:15.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.305 "is_configured": false, 00:19:15.305 "data_offset": 0, 00:19:15.305 "data_size": 0 00:19:15.305 } 00:19:15.305 ] 00:19:15.305 }' 00:19:15.305 09:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.305 09:23:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:15.872 09:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:15.872 [2024-07-15 09:23:24.818689] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:15.872 [2024-07-15 09:23:24.818730] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e0310 name Existed_Raid, state configuring 00:19:16.131 09:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:16.131 [2024-07-15 09:23:24.995201] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:16.131 [2024-07-15 09:23:24.996643] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:16.131 [2024-07-15 09:23:24.996678] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:16.131 [2024-07-15 09:23:24.996688] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:16.131 [2024-07-15 09:23:24.996700] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:16.131 [2024-07-15 09:23:24.996709] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:16.131 [2024-07-15 09:23:24.996720] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:16.131 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:16.131 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:16.131 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:16.131 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:16.131 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:16.131 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:16.131 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:16.131 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:16.132 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:16.132 
09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:16.132 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:16.132 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:16.132 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.132 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:16.391 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.391 "name": "Existed_Raid", 00:19:16.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.391 "strip_size_kb": 64, 00:19:16.391 "state": "configuring", 00:19:16.391 "raid_level": "concat", 00:19:16.391 "superblock": false, 00:19:16.391 "num_base_bdevs": 4, 00:19:16.391 "num_base_bdevs_discovered": 1, 00:19:16.391 "num_base_bdevs_operational": 4, 00:19:16.391 "base_bdevs_list": [ 00:19:16.391 { 00:19:16.391 "name": "BaseBdev1", 00:19:16.391 "uuid": "8a61f8c7-0684-4b24-9e78-09bc54500aff", 00:19:16.391 "is_configured": true, 00:19:16.391 "data_offset": 0, 00:19:16.391 "data_size": 65536 00:19:16.391 }, 00:19:16.391 { 00:19:16.391 "name": "BaseBdev2", 00:19:16.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.391 "is_configured": false, 00:19:16.391 "data_offset": 0, 00:19:16.391 "data_size": 0 00:19:16.391 }, 00:19:16.391 { 00:19:16.391 "name": "BaseBdev3", 00:19:16.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.391 "is_configured": false, 00:19:16.391 "data_offset": 0, 00:19:16.391 "data_size": 0 00:19:16.391 }, 00:19:16.391 { 00:19:16.391 "name": "BaseBdev4", 00:19:16.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.391 "is_configured": false, 00:19:16.391 "data_offset": 0, 00:19:16.391 "data_size": 0 00:19:16.391 } 00:19:16.391 ] 00:19:16.391 }' 00:19:16.391 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.391 09:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:16.959 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:17.219 [2024-07-15 09:23:25.953181] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:17.219 BaseBdev2 00:19:17.219 09:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:17.219 09:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:17.219 09:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:17.219 09:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:17.219 09:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:17.219 09:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:17.219 09:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:17.219 09:23:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:17.479 [ 00:19:17.479 { 00:19:17.479 "name": "BaseBdev2", 00:19:17.479 "aliases": [ 00:19:17.479 "8ce33147-1df3-4f20-b063-acc031d51373" 00:19:17.479 ], 00:19:17.479 "product_name": "Malloc disk", 00:19:17.479 "block_size": 512, 00:19:17.479 "num_blocks": 65536, 00:19:17.479 "uuid": "8ce33147-1df3-4f20-b063-acc031d51373", 00:19:17.479 "assigned_rate_limits": { 00:19:17.479 "rw_ios_per_sec": 0, 00:19:17.479 "rw_mbytes_per_sec": 0, 00:19:17.479 "r_mbytes_per_sec": 0, 00:19:17.479 "w_mbytes_per_sec": 0 00:19:17.479 }, 00:19:17.479 "claimed": true, 00:19:17.479 "claim_type": "exclusive_write", 00:19:17.479 "zoned": false, 00:19:17.479 "supported_io_types": { 00:19:17.479 "read": true, 00:19:17.479 "write": true, 00:19:17.479 "unmap": true, 00:19:17.479 "flush": true, 00:19:17.479 "reset": true, 00:19:17.479 "nvme_admin": false, 00:19:17.479 "nvme_io": false, 00:19:17.479 "nvme_io_md": false, 00:19:17.479 "write_zeroes": true, 00:19:17.479 "zcopy": true, 00:19:17.479 "get_zone_info": false, 00:19:17.479 "zone_management": false, 00:19:17.479 "zone_append": false, 00:19:17.479 "compare": false, 00:19:17.479 "compare_and_write": false, 00:19:17.479 "abort": true, 00:19:17.479 "seek_hole": false, 00:19:17.479 "seek_data": false, 00:19:17.479 "copy": true, 00:19:17.479 "nvme_iov_md": false 00:19:17.479 }, 00:19:17.479 "memory_domains": [ 00:19:17.479 { 00:19:17.479 "dma_device_id": "system", 00:19:17.479 "dma_device_type": 1 00:19:17.479 }, 00:19:17.479 { 00:19:17.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:17.479 "dma_device_type": 2 00:19:17.479 } 00:19:17.479 ], 00:19:17.479 "driver_specific": {} 00:19:17.479 } 00:19:17.479 ] 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.479 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.479 
09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.739 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.739 "name": "Existed_Raid", 00:19:17.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.739 "strip_size_kb": 64, 00:19:17.739 "state": "configuring", 00:19:17.739 "raid_level": "concat", 00:19:17.739 "superblock": false, 00:19:17.739 "num_base_bdevs": 4, 00:19:17.739 "num_base_bdevs_discovered": 2, 00:19:17.739 "num_base_bdevs_operational": 4, 00:19:17.739 "base_bdevs_list": [ 00:19:17.739 { 00:19:17.739 "name": "BaseBdev1", 00:19:17.739 "uuid": "8a61f8c7-0684-4b24-9e78-09bc54500aff", 00:19:17.739 "is_configured": true, 00:19:17.739 "data_offset": 0, 00:19:17.739 "data_size": 65536 00:19:17.739 }, 00:19:17.739 { 00:19:17.739 "name": "BaseBdev2", 00:19:17.739 "uuid": "8ce33147-1df3-4f20-b063-acc031d51373", 00:19:17.739 "is_configured": true, 00:19:17.739 "data_offset": 0, 00:19:17.739 "data_size": 65536 00:19:17.739 }, 00:19:17.739 { 00:19:17.739 "name": "BaseBdev3", 00:19:17.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.739 "is_configured": false, 00:19:17.739 "data_offset": 0, 00:19:17.739 "data_size": 0 00:19:17.739 }, 00:19:17.739 { 00:19:17.739 "name": "BaseBdev4", 00:19:17.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.739 "is_configured": false, 00:19:17.739 "data_offset": 0, 00:19:17.739 "data_size": 0 00:19:17.739 } 00:19:17.739 ] 00:19:17.739 }' 00:19:17.739 09:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.739 09:23:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:18.307 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:18.307 [2024-07-15 09:23:27.244034] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:18.307 BaseBdev3 00:19:18.566 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:18.566 09:23:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:18.566 09:23:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:18.566 09:23:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:18.566 09:23:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:18.566 09:23:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:18.566 09:23:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.566 09:23:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:18.826 [ 00:19:18.826 { 00:19:18.826 "name": "BaseBdev3", 00:19:18.826 "aliases": [ 00:19:18.826 "5f26eb25-d95c-41ca-ac22-2784151cc50f" 00:19:18.826 ], 00:19:18.826 "product_name": "Malloc disk", 00:19:18.826 "block_size": 512, 00:19:18.826 "num_blocks": 65536, 00:19:18.826 "uuid": "5f26eb25-d95c-41ca-ac22-2784151cc50f", 00:19:18.826 
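(The remaining base bdevs are brought up the same way; a condensed sketch of one iteration of that loop, using the RPC calls visible in the trace, with paths shortened to the SPDK repo root. The 32/512 arguments to bdev_malloc_create correspond to the 65536 x 512-byte blocks reported in the dumps above.)

    # create the next base bdev and wait for it to register (waitforbdev's 2000 timeout)
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
    # re-read the raid bdev; verify_raid_bdev_state checks .state and the base-bdev counts:
    # Existed_Raid stays "configuring" until all 4 base bdevs are discovered, then goes "online"
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid")'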
"assigned_rate_limits": { 00:19:18.826 "rw_ios_per_sec": 0, 00:19:18.826 "rw_mbytes_per_sec": 0, 00:19:18.826 "r_mbytes_per_sec": 0, 00:19:18.826 "w_mbytes_per_sec": 0 00:19:18.826 }, 00:19:18.826 "claimed": true, 00:19:18.826 "claim_type": "exclusive_write", 00:19:18.826 "zoned": false, 00:19:18.826 "supported_io_types": { 00:19:18.826 "read": true, 00:19:18.826 "write": true, 00:19:18.826 "unmap": true, 00:19:18.826 "flush": true, 00:19:18.826 "reset": true, 00:19:18.826 "nvme_admin": false, 00:19:18.826 "nvme_io": false, 00:19:18.826 "nvme_io_md": false, 00:19:18.826 "write_zeroes": true, 00:19:18.826 "zcopy": true, 00:19:18.826 "get_zone_info": false, 00:19:18.826 "zone_management": false, 00:19:18.826 "zone_append": false, 00:19:18.826 "compare": false, 00:19:18.826 "compare_and_write": false, 00:19:18.826 "abort": true, 00:19:18.826 "seek_hole": false, 00:19:18.826 "seek_data": false, 00:19:18.826 "copy": true, 00:19:18.826 "nvme_iov_md": false 00:19:18.826 }, 00:19:18.826 "memory_domains": [ 00:19:18.826 { 00:19:18.826 "dma_device_id": "system", 00:19:18.826 "dma_device_type": 1 00:19:18.826 }, 00:19:18.826 { 00:19:18.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.826 "dma_device_type": 2 00:19:18.826 } 00:19:18.826 ], 00:19:18.826 "driver_specific": {} 00:19:18.826 } 00:19:18.826 ] 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.826 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.826 "name": "Existed_Raid", 00:19:18.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.826 "strip_size_kb": 64, 00:19:18.826 "state": "configuring", 00:19:18.826 "raid_level": "concat", 00:19:18.826 "superblock": false, 00:19:18.826 "num_base_bdevs": 4, 00:19:18.826 "num_base_bdevs_discovered": 3, 
00:19:18.826 "num_base_bdevs_operational": 4, 00:19:18.826 "base_bdevs_list": [ 00:19:18.826 { 00:19:18.826 "name": "BaseBdev1", 00:19:18.826 "uuid": "8a61f8c7-0684-4b24-9e78-09bc54500aff", 00:19:18.826 "is_configured": true, 00:19:18.826 "data_offset": 0, 00:19:18.826 "data_size": 65536 00:19:18.826 }, 00:19:18.826 { 00:19:18.826 "name": "BaseBdev2", 00:19:18.826 "uuid": "8ce33147-1df3-4f20-b063-acc031d51373", 00:19:18.826 "is_configured": true, 00:19:18.826 "data_offset": 0, 00:19:18.826 "data_size": 65536 00:19:18.826 }, 00:19:18.826 { 00:19:18.826 "name": "BaseBdev3", 00:19:18.826 "uuid": "5f26eb25-d95c-41ca-ac22-2784151cc50f", 00:19:18.826 "is_configured": true, 00:19:18.826 "data_offset": 0, 00:19:18.826 "data_size": 65536 00:19:18.826 }, 00:19:18.826 { 00:19:18.827 "name": "BaseBdev4", 00:19:18.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.827 "is_configured": false, 00:19:18.827 "data_offset": 0, 00:19:18.827 "data_size": 0 00:19:18.827 } 00:19:18.827 ] 00:19:18.827 }' 00:19:18.827 09:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.827 09:23:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.394 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:19.654 [2024-07-15 09:23:28.534914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:19.654 [2024-07-15 09:23:28.534963] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13e1350 00:19:19.654 [2024-07-15 09:23:28.534972] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:19.654 [2024-07-15 09:23:28.535222] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e1020 00:19:19.654 [2024-07-15 09:23:28.535346] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13e1350 00:19:19.654 [2024-07-15 09:23:28.535356] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13e1350 00:19:19.654 [2024-07-15 09:23:28.535520] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:19.654 BaseBdev4 00:19:19.654 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:19.654 09:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:19.654 09:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:19.654 09:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:19.654 09:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:19.654 09:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:19.654 09:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:19.913 09:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:20.173 [ 00:19:20.173 { 00:19:20.173 "name": "BaseBdev4", 00:19:20.173 "aliases": [ 00:19:20.173 
"505288b7-f693-4110-a57a-a710b448db5f" 00:19:20.173 ], 00:19:20.173 "product_name": "Malloc disk", 00:19:20.173 "block_size": 512, 00:19:20.173 "num_blocks": 65536, 00:19:20.173 "uuid": "505288b7-f693-4110-a57a-a710b448db5f", 00:19:20.173 "assigned_rate_limits": { 00:19:20.173 "rw_ios_per_sec": 0, 00:19:20.173 "rw_mbytes_per_sec": 0, 00:19:20.173 "r_mbytes_per_sec": 0, 00:19:20.173 "w_mbytes_per_sec": 0 00:19:20.173 }, 00:19:20.173 "claimed": true, 00:19:20.174 "claim_type": "exclusive_write", 00:19:20.174 "zoned": false, 00:19:20.174 "supported_io_types": { 00:19:20.174 "read": true, 00:19:20.174 "write": true, 00:19:20.174 "unmap": true, 00:19:20.174 "flush": true, 00:19:20.174 "reset": true, 00:19:20.174 "nvme_admin": false, 00:19:20.174 "nvme_io": false, 00:19:20.174 "nvme_io_md": false, 00:19:20.174 "write_zeroes": true, 00:19:20.174 "zcopy": true, 00:19:20.174 "get_zone_info": false, 00:19:20.174 "zone_management": false, 00:19:20.174 "zone_append": false, 00:19:20.174 "compare": false, 00:19:20.174 "compare_and_write": false, 00:19:20.174 "abort": true, 00:19:20.174 "seek_hole": false, 00:19:20.174 "seek_data": false, 00:19:20.174 "copy": true, 00:19:20.174 "nvme_iov_md": false 00:19:20.174 }, 00:19:20.174 "memory_domains": [ 00:19:20.174 { 00:19:20.174 "dma_device_id": "system", 00:19:20.174 "dma_device_type": 1 00:19:20.174 }, 00:19:20.174 { 00:19:20.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.174 "dma_device_type": 2 00:19:20.174 } 00:19:20.174 ], 00:19:20.174 "driver_specific": {} 00:19:20.174 } 00:19:20.174 ] 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.174 09:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.174 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.174 "name": "Existed_Raid", 00:19:20.174 "uuid": 
"3da1a9fe-c094-4c5e-87b5-06e8bb2beabe", 00:19:20.174 "strip_size_kb": 64, 00:19:20.174 "state": "online", 00:19:20.174 "raid_level": "concat", 00:19:20.174 "superblock": false, 00:19:20.174 "num_base_bdevs": 4, 00:19:20.174 "num_base_bdevs_discovered": 4, 00:19:20.174 "num_base_bdevs_operational": 4, 00:19:20.174 "base_bdevs_list": [ 00:19:20.174 { 00:19:20.174 "name": "BaseBdev1", 00:19:20.174 "uuid": "8a61f8c7-0684-4b24-9e78-09bc54500aff", 00:19:20.174 "is_configured": true, 00:19:20.174 "data_offset": 0, 00:19:20.174 "data_size": 65536 00:19:20.174 }, 00:19:20.174 { 00:19:20.174 "name": "BaseBdev2", 00:19:20.174 "uuid": "8ce33147-1df3-4f20-b063-acc031d51373", 00:19:20.174 "is_configured": true, 00:19:20.174 "data_offset": 0, 00:19:20.174 "data_size": 65536 00:19:20.174 }, 00:19:20.174 { 00:19:20.174 "name": "BaseBdev3", 00:19:20.174 "uuid": "5f26eb25-d95c-41ca-ac22-2784151cc50f", 00:19:20.174 "is_configured": true, 00:19:20.174 "data_offset": 0, 00:19:20.174 "data_size": 65536 00:19:20.174 }, 00:19:20.174 { 00:19:20.174 "name": "BaseBdev4", 00:19:20.174 "uuid": "505288b7-f693-4110-a57a-a710b448db5f", 00:19:20.174 "is_configured": true, 00:19:20.174 "data_offset": 0, 00:19:20.174 "data_size": 65536 00:19:20.174 } 00:19:20.174 ] 00:19:20.174 }' 00:19:20.174 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.174 09:23:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:20.742 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:20.742 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:20.742 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:20.742 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:20.742 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:20.742 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:21.002 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:21.002 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:21.002 [2024-07-15 09:23:29.850760] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:21.002 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:21.002 "name": "Existed_Raid", 00:19:21.002 "aliases": [ 00:19:21.002 "3da1a9fe-c094-4c5e-87b5-06e8bb2beabe" 00:19:21.002 ], 00:19:21.002 "product_name": "Raid Volume", 00:19:21.002 "block_size": 512, 00:19:21.002 "num_blocks": 262144, 00:19:21.002 "uuid": "3da1a9fe-c094-4c5e-87b5-06e8bb2beabe", 00:19:21.002 "assigned_rate_limits": { 00:19:21.002 "rw_ios_per_sec": 0, 00:19:21.002 "rw_mbytes_per_sec": 0, 00:19:21.002 "r_mbytes_per_sec": 0, 00:19:21.002 "w_mbytes_per_sec": 0 00:19:21.002 }, 00:19:21.002 "claimed": false, 00:19:21.002 "zoned": false, 00:19:21.002 "supported_io_types": { 00:19:21.002 "read": true, 00:19:21.002 "write": true, 00:19:21.002 "unmap": true, 00:19:21.002 "flush": true, 00:19:21.002 "reset": true, 00:19:21.002 "nvme_admin": false, 00:19:21.002 "nvme_io": false, 00:19:21.002 "nvme_io_md": false, 00:19:21.002 "write_zeroes": true, 00:19:21.002 
"zcopy": false, 00:19:21.002 "get_zone_info": false, 00:19:21.002 "zone_management": false, 00:19:21.002 "zone_append": false, 00:19:21.002 "compare": false, 00:19:21.002 "compare_and_write": false, 00:19:21.002 "abort": false, 00:19:21.002 "seek_hole": false, 00:19:21.002 "seek_data": false, 00:19:21.002 "copy": false, 00:19:21.002 "nvme_iov_md": false 00:19:21.002 }, 00:19:21.002 "memory_domains": [ 00:19:21.002 { 00:19:21.002 "dma_device_id": "system", 00:19:21.002 "dma_device_type": 1 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.002 "dma_device_type": 2 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "dma_device_id": "system", 00:19:21.002 "dma_device_type": 1 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.002 "dma_device_type": 2 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "dma_device_id": "system", 00:19:21.002 "dma_device_type": 1 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.002 "dma_device_type": 2 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "dma_device_id": "system", 00:19:21.002 "dma_device_type": 1 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.002 "dma_device_type": 2 00:19:21.002 } 00:19:21.002 ], 00:19:21.002 "driver_specific": { 00:19:21.002 "raid": { 00:19:21.002 "uuid": "3da1a9fe-c094-4c5e-87b5-06e8bb2beabe", 00:19:21.002 "strip_size_kb": 64, 00:19:21.002 "state": "online", 00:19:21.002 "raid_level": "concat", 00:19:21.002 "superblock": false, 00:19:21.002 "num_base_bdevs": 4, 00:19:21.002 "num_base_bdevs_discovered": 4, 00:19:21.002 "num_base_bdevs_operational": 4, 00:19:21.002 "base_bdevs_list": [ 00:19:21.002 { 00:19:21.002 "name": "BaseBdev1", 00:19:21.002 "uuid": "8a61f8c7-0684-4b24-9e78-09bc54500aff", 00:19:21.002 "is_configured": true, 00:19:21.002 "data_offset": 0, 00:19:21.002 "data_size": 65536 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "name": "BaseBdev2", 00:19:21.002 "uuid": "8ce33147-1df3-4f20-b063-acc031d51373", 00:19:21.002 "is_configured": true, 00:19:21.002 "data_offset": 0, 00:19:21.002 "data_size": 65536 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "name": "BaseBdev3", 00:19:21.002 "uuid": "5f26eb25-d95c-41ca-ac22-2784151cc50f", 00:19:21.002 "is_configured": true, 00:19:21.002 "data_offset": 0, 00:19:21.002 "data_size": 65536 00:19:21.002 }, 00:19:21.002 { 00:19:21.002 "name": "BaseBdev4", 00:19:21.002 "uuid": "505288b7-f693-4110-a57a-a710b448db5f", 00:19:21.002 "is_configured": true, 00:19:21.002 "data_offset": 0, 00:19:21.002 "data_size": 65536 00:19:21.002 } 00:19:21.002 ] 00:19:21.002 } 00:19:21.002 } 00:19:21.002 }' 00:19:21.002 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:21.002 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:21.002 BaseBdev2 00:19:21.002 BaseBdev3 00:19:21.002 BaseBdev4' 00:19:21.002 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:21.002 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:21.002 09:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:21.261 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
base_bdev_info='{ 00:19:21.261 "name": "BaseBdev1", 00:19:21.261 "aliases": [ 00:19:21.261 "8a61f8c7-0684-4b24-9e78-09bc54500aff" 00:19:21.261 ], 00:19:21.261 "product_name": "Malloc disk", 00:19:21.261 "block_size": 512, 00:19:21.261 "num_blocks": 65536, 00:19:21.261 "uuid": "8a61f8c7-0684-4b24-9e78-09bc54500aff", 00:19:21.261 "assigned_rate_limits": { 00:19:21.261 "rw_ios_per_sec": 0, 00:19:21.261 "rw_mbytes_per_sec": 0, 00:19:21.261 "r_mbytes_per_sec": 0, 00:19:21.261 "w_mbytes_per_sec": 0 00:19:21.261 }, 00:19:21.261 "claimed": true, 00:19:21.261 "claim_type": "exclusive_write", 00:19:21.261 "zoned": false, 00:19:21.261 "supported_io_types": { 00:19:21.261 "read": true, 00:19:21.261 "write": true, 00:19:21.261 "unmap": true, 00:19:21.261 "flush": true, 00:19:21.261 "reset": true, 00:19:21.261 "nvme_admin": false, 00:19:21.261 "nvme_io": false, 00:19:21.261 "nvme_io_md": false, 00:19:21.261 "write_zeroes": true, 00:19:21.261 "zcopy": true, 00:19:21.261 "get_zone_info": false, 00:19:21.261 "zone_management": false, 00:19:21.261 "zone_append": false, 00:19:21.261 "compare": false, 00:19:21.261 "compare_and_write": false, 00:19:21.261 "abort": true, 00:19:21.261 "seek_hole": false, 00:19:21.261 "seek_data": false, 00:19:21.261 "copy": true, 00:19:21.261 "nvme_iov_md": false 00:19:21.261 }, 00:19:21.261 "memory_domains": [ 00:19:21.261 { 00:19:21.261 "dma_device_id": "system", 00:19:21.261 "dma_device_type": 1 00:19:21.261 }, 00:19:21.261 { 00:19:21.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.261 "dma_device_type": 2 00:19:21.261 } 00:19:21.261 ], 00:19:21.261 "driver_specific": {} 00:19:21.261 }' 00:19:21.261 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.261 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:21.520 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:22.088 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:22.088 "name": "BaseBdev2", 00:19:22.088 "aliases": [ 00:19:22.088 "8ce33147-1df3-4f20-b063-acc031d51373" 00:19:22.088 ], 00:19:22.088 "product_name": 
"Malloc disk", 00:19:22.088 "block_size": 512, 00:19:22.088 "num_blocks": 65536, 00:19:22.088 "uuid": "8ce33147-1df3-4f20-b063-acc031d51373", 00:19:22.088 "assigned_rate_limits": { 00:19:22.088 "rw_ios_per_sec": 0, 00:19:22.088 "rw_mbytes_per_sec": 0, 00:19:22.088 "r_mbytes_per_sec": 0, 00:19:22.088 "w_mbytes_per_sec": 0 00:19:22.088 }, 00:19:22.088 "claimed": true, 00:19:22.088 "claim_type": "exclusive_write", 00:19:22.088 "zoned": false, 00:19:22.088 "supported_io_types": { 00:19:22.088 "read": true, 00:19:22.088 "write": true, 00:19:22.088 "unmap": true, 00:19:22.088 "flush": true, 00:19:22.088 "reset": true, 00:19:22.088 "nvme_admin": false, 00:19:22.088 "nvme_io": false, 00:19:22.088 "nvme_io_md": false, 00:19:22.088 "write_zeroes": true, 00:19:22.088 "zcopy": true, 00:19:22.088 "get_zone_info": false, 00:19:22.088 "zone_management": false, 00:19:22.088 "zone_append": false, 00:19:22.088 "compare": false, 00:19:22.088 "compare_and_write": false, 00:19:22.088 "abort": true, 00:19:22.088 "seek_hole": false, 00:19:22.088 "seek_data": false, 00:19:22.088 "copy": true, 00:19:22.088 "nvme_iov_md": false 00:19:22.088 }, 00:19:22.088 "memory_domains": [ 00:19:22.088 { 00:19:22.088 "dma_device_id": "system", 00:19:22.088 "dma_device_type": 1 00:19:22.088 }, 00:19:22.088 { 00:19:22.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.088 "dma_device_type": 2 00:19:22.088 } 00:19:22.088 ], 00:19:22.088 "driver_specific": {} 00:19:22.088 }' 00:19:22.088 09:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.088 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.347 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:22.347 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:22.348 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:22.348 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:22.348 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:22.348 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:22.348 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:22.348 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.607 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.607 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:22.607 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:22.607 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:22.607 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:23.175 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:23.175 "name": "BaseBdev3", 00:19:23.175 "aliases": [ 00:19:23.175 "5f26eb25-d95c-41ca-ac22-2784151cc50f" 00:19:23.175 ], 00:19:23.175 "product_name": "Malloc disk", 00:19:23.175 "block_size": 512, 00:19:23.175 "num_blocks": 65536, 00:19:23.175 "uuid": "5f26eb25-d95c-41ca-ac22-2784151cc50f", 00:19:23.175 "assigned_rate_limits": { 
00:19:23.175 "rw_ios_per_sec": 0, 00:19:23.175 "rw_mbytes_per_sec": 0, 00:19:23.175 "r_mbytes_per_sec": 0, 00:19:23.175 "w_mbytes_per_sec": 0 00:19:23.175 }, 00:19:23.175 "claimed": true, 00:19:23.175 "claim_type": "exclusive_write", 00:19:23.175 "zoned": false, 00:19:23.175 "supported_io_types": { 00:19:23.175 "read": true, 00:19:23.175 "write": true, 00:19:23.175 "unmap": true, 00:19:23.175 "flush": true, 00:19:23.175 "reset": true, 00:19:23.175 "nvme_admin": false, 00:19:23.175 "nvme_io": false, 00:19:23.175 "nvme_io_md": false, 00:19:23.175 "write_zeroes": true, 00:19:23.175 "zcopy": true, 00:19:23.175 "get_zone_info": false, 00:19:23.175 "zone_management": false, 00:19:23.175 "zone_append": false, 00:19:23.175 "compare": false, 00:19:23.175 "compare_and_write": false, 00:19:23.175 "abort": true, 00:19:23.175 "seek_hole": false, 00:19:23.175 "seek_data": false, 00:19:23.175 "copy": true, 00:19:23.175 "nvme_iov_md": false 00:19:23.175 }, 00:19:23.175 "memory_domains": [ 00:19:23.175 { 00:19:23.175 "dma_device_id": "system", 00:19:23.175 "dma_device_type": 1 00:19:23.175 }, 00:19:23.175 { 00:19:23.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.175 "dma_device_type": 2 00:19:23.175 } 00:19:23.175 ], 00:19:23.175 "driver_specific": {} 00:19:23.175 }' 00:19:23.175 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.175 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.175 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:23.175 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.175 09:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.175 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:23.175 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.175 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.175 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:23.175 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.434 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.434 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:23.434 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:23.434 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:23.434 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:23.693 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:23.693 "name": "BaseBdev4", 00:19:23.693 "aliases": [ 00:19:23.693 "505288b7-f693-4110-a57a-a710b448db5f" 00:19:23.693 ], 00:19:23.693 "product_name": "Malloc disk", 00:19:23.693 "block_size": 512, 00:19:23.693 "num_blocks": 65536, 00:19:23.693 "uuid": "505288b7-f693-4110-a57a-a710b448db5f", 00:19:23.693 "assigned_rate_limits": { 00:19:23.693 "rw_ios_per_sec": 0, 00:19:23.693 "rw_mbytes_per_sec": 0, 00:19:23.694 "r_mbytes_per_sec": 0, 00:19:23.694 "w_mbytes_per_sec": 0 00:19:23.694 }, 00:19:23.694 "claimed": 
true, 00:19:23.694 "claim_type": "exclusive_write", 00:19:23.694 "zoned": false, 00:19:23.694 "supported_io_types": { 00:19:23.694 "read": true, 00:19:23.694 "write": true, 00:19:23.694 "unmap": true, 00:19:23.694 "flush": true, 00:19:23.694 "reset": true, 00:19:23.694 "nvme_admin": false, 00:19:23.694 "nvme_io": false, 00:19:23.694 "nvme_io_md": false, 00:19:23.694 "write_zeroes": true, 00:19:23.694 "zcopy": true, 00:19:23.694 "get_zone_info": false, 00:19:23.694 "zone_management": false, 00:19:23.694 "zone_append": false, 00:19:23.694 "compare": false, 00:19:23.694 "compare_and_write": false, 00:19:23.694 "abort": true, 00:19:23.694 "seek_hole": false, 00:19:23.694 "seek_data": false, 00:19:23.694 "copy": true, 00:19:23.694 "nvme_iov_md": false 00:19:23.694 }, 00:19:23.694 "memory_domains": [ 00:19:23.694 { 00:19:23.694 "dma_device_id": "system", 00:19:23.694 "dma_device_type": 1 00:19:23.694 }, 00:19:23.694 { 00:19:23.694 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.694 "dma_device_type": 2 00:19:23.694 } 00:19:23.694 ], 00:19:23.694 "driver_specific": {} 00:19:23.694 }' 00:19:23.694 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.694 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.694 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:23.694 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.694 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.694 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:23.694 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.953 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.953 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:23.953 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.953 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.953 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:23.953 09:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:24.212 [2024-07-15 09:23:33.015077] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:24.212 [2024-07-15 09:23:33.015110] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:24.212 [2024-07-15 09:23:33.015160] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 
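At this step the test has just deleted BaseBdev1 out from under the array; because has_redundancy returned 1 for the concat level, verify_raid_bdev_state expects Existed_Raid to drop to "offline" with 3 operational base bdevs. A minimal standalone sketch of that query, reusing only the rpc.py invocation and jq filter visible in this trace (pulling the individual .state and .num_base_bdevs_operational fields out separately is an illustrative assumption, not part of the script):

    # Hedged sketch: assumes the same /var/tmp/spdk-raid.sock RPC socket as this run is live.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    info=$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    echo "$info" | jq -r .state                        # expected at this point: offline
    echo "$info" | jq -r .num_base_bdevs_operational   # expected at this point: 3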
00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.212 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:24.470 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.470 "name": "Existed_Raid", 00:19:24.470 "uuid": "3da1a9fe-c094-4c5e-87b5-06e8bb2beabe", 00:19:24.470 "strip_size_kb": 64, 00:19:24.470 "state": "offline", 00:19:24.470 "raid_level": "concat", 00:19:24.470 "superblock": false, 00:19:24.470 "num_base_bdevs": 4, 00:19:24.470 "num_base_bdevs_discovered": 3, 00:19:24.470 "num_base_bdevs_operational": 3, 00:19:24.470 "base_bdevs_list": [ 00:19:24.470 { 00:19:24.470 "name": null, 00:19:24.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.470 "is_configured": false, 00:19:24.470 "data_offset": 0, 00:19:24.470 "data_size": 65536 00:19:24.470 }, 00:19:24.470 { 00:19:24.470 "name": "BaseBdev2", 00:19:24.470 "uuid": "8ce33147-1df3-4f20-b063-acc031d51373", 00:19:24.470 "is_configured": true, 00:19:24.470 "data_offset": 0, 00:19:24.470 "data_size": 65536 00:19:24.470 }, 00:19:24.470 { 00:19:24.470 "name": "BaseBdev3", 00:19:24.470 "uuid": "5f26eb25-d95c-41ca-ac22-2784151cc50f", 00:19:24.470 "is_configured": true, 00:19:24.470 "data_offset": 0, 00:19:24.470 "data_size": 65536 00:19:24.470 }, 00:19:24.470 { 00:19:24.470 "name": "BaseBdev4", 00:19:24.470 "uuid": "505288b7-f693-4110-a57a-a710b448db5f", 00:19:24.470 "is_configured": true, 00:19:24.470 "data_offset": 0, 00:19:24.470 "data_size": 65536 00:19:24.470 } 00:19:24.470 ] 00:19:24.470 }' 00:19:24.471 09:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.471 09:23:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:25.406 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:25.406 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:25.406 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.406 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:25.406 09:23:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:25.406 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:25.406 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:25.675 [2024-07-15 09:23:34.488924] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:25.675 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:25.675 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:25.675 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.675 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:25.956 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:25.956 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:25.956 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:26.230 [2024-07-15 09:23:34.974682] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:26.230 09:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:26.230 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:26.230 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.230 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:26.489 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:26.489 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:26.489 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:27.057 [2024-07-15 09:23:35.723805] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:27.057 [2024-07-15 09:23:35.723848] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e1350 name Existed_Raid, state offline 00:19:27.057 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:27.057 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:27.057 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.057 09:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:27.316 09:23:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:27.316 BaseBdev2 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:27.316 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:27.574 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:27.833 [ 00:19:27.833 { 00:19:27.833 "name": "BaseBdev2", 00:19:27.833 "aliases": [ 00:19:27.833 "15624a6c-8875-4b6b-a2be-9eac8854a168" 00:19:27.833 ], 00:19:27.833 "product_name": "Malloc disk", 00:19:27.833 "block_size": 512, 00:19:27.833 "num_blocks": 65536, 00:19:27.833 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:27.833 "assigned_rate_limits": { 00:19:27.833 "rw_ios_per_sec": 0, 00:19:27.833 "rw_mbytes_per_sec": 0, 00:19:27.833 "r_mbytes_per_sec": 0, 00:19:27.833 "w_mbytes_per_sec": 0 00:19:27.833 }, 00:19:27.833 "claimed": false, 00:19:27.833 "zoned": false, 00:19:27.833 "supported_io_types": { 00:19:27.833 "read": true, 00:19:27.833 "write": true, 00:19:27.833 "unmap": true, 00:19:27.833 "flush": true, 00:19:27.833 "reset": true, 00:19:27.833 "nvme_admin": false, 00:19:27.833 "nvme_io": false, 00:19:27.833 "nvme_io_md": false, 00:19:27.833 "write_zeroes": true, 00:19:27.833 "zcopy": true, 00:19:27.833 "get_zone_info": false, 00:19:27.833 "zone_management": false, 00:19:27.833 "zone_append": false, 00:19:27.833 "compare": false, 00:19:27.833 "compare_and_write": false, 00:19:27.833 "abort": true, 00:19:27.833 "seek_hole": false, 00:19:27.833 "seek_data": false, 00:19:27.833 "copy": true, 00:19:27.833 "nvme_iov_md": false 00:19:27.833 }, 00:19:27.833 "memory_domains": [ 00:19:27.833 { 00:19:27.833 "dma_device_id": "system", 00:19:27.833 "dma_device_type": 1 00:19:27.833 }, 00:19:27.833 { 00:19:27.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.833 "dma_device_type": 2 00:19:27.833 } 00:19:27.833 ], 00:19:27.833 "driver_specific": {} 00:19:27.833 } 00:19:27.833 ] 00:19:27.833 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:27.833 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:27.833 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs 
)) 00:19:27.833 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:28.092 BaseBdev3 00:19:28.092 09:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:28.092 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:28.092 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:28.092 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:28.092 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:28.092 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:28.092 09:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:28.351 09:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:28.611 [ 00:19:28.611 { 00:19:28.611 "name": "BaseBdev3", 00:19:28.611 "aliases": [ 00:19:28.611 "e8d7de2d-2a68-45fb-b043-d046ea96addc" 00:19:28.611 ], 00:19:28.611 "product_name": "Malloc disk", 00:19:28.611 "block_size": 512, 00:19:28.611 "num_blocks": 65536, 00:19:28.611 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:28.611 "assigned_rate_limits": { 00:19:28.611 "rw_ios_per_sec": 0, 00:19:28.611 "rw_mbytes_per_sec": 0, 00:19:28.611 "r_mbytes_per_sec": 0, 00:19:28.611 "w_mbytes_per_sec": 0 00:19:28.611 }, 00:19:28.611 "claimed": false, 00:19:28.611 "zoned": false, 00:19:28.611 "supported_io_types": { 00:19:28.611 "read": true, 00:19:28.611 "write": true, 00:19:28.611 "unmap": true, 00:19:28.611 "flush": true, 00:19:28.611 "reset": true, 00:19:28.611 "nvme_admin": false, 00:19:28.611 "nvme_io": false, 00:19:28.611 "nvme_io_md": false, 00:19:28.611 "write_zeroes": true, 00:19:28.611 "zcopy": true, 00:19:28.611 "get_zone_info": false, 00:19:28.611 "zone_management": false, 00:19:28.611 "zone_append": false, 00:19:28.611 "compare": false, 00:19:28.611 "compare_and_write": false, 00:19:28.611 "abort": true, 00:19:28.611 "seek_hole": false, 00:19:28.611 "seek_data": false, 00:19:28.611 "copy": true, 00:19:28.611 "nvme_iov_md": false 00:19:28.611 }, 00:19:28.611 "memory_domains": [ 00:19:28.611 { 00:19:28.611 "dma_device_id": "system", 00:19:28.611 "dma_device_type": 1 00:19:28.611 }, 00:19:28.611 { 00:19:28.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.611 "dma_device_type": 2 00:19:28.611 } 00:19:28.611 ], 00:19:28.611 "driver_specific": {} 00:19:28.611 } 00:19:28.611 ] 00:19:28.611 09:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:28.611 09:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:28.611 09:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:28.611 09:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:28.871 BaseBdev4 00:19:28.871 09:23:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:28.871 09:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:28.871 09:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:28.871 09:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:28.871 09:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:28.871 09:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:28.871 09:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:29.130 09:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:29.389 [ 00:19:29.389 { 00:19:29.389 "name": "BaseBdev4", 00:19:29.389 "aliases": [ 00:19:29.389 "c7a8c587-696a-44fc-a5c5-897415363011" 00:19:29.389 ], 00:19:29.389 "product_name": "Malloc disk", 00:19:29.389 "block_size": 512, 00:19:29.389 "num_blocks": 65536, 00:19:29.389 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:29.389 "assigned_rate_limits": { 00:19:29.389 "rw_ios_per_sec": 0, 00:19:29.389 "rw_mbytes_per_sec": 0, 00:19:29.389 "r_mbytes_per_sec": 0, 00:19:29.389 "w_mbytes_per_sec": 0 00:19:29.389 }, 00:19:29.389 "claimed": false, 00:19:29.389 "zoned": false, 00:19:29.389 "supported_io_types": { 00:19:29.389 "read": true, 00:19:29.389 "write": true, 00:19:29.389 "unmap": true, 00:19:29.389 "flush": true, 00:19:29.389 "reset": true, 00:19:29.389 "nvme_admin": false, 00:19:29.389 "nvme_io": false, 00:19:29.389 "nvme_io_md": false, 00:19:29.389 "write_zeroes": true, 00:19:29.389 "zcopy": true, 00:19:29.389 "get_zone_info": false, 00:19:29.389 "zone_management": false, 00:19:29.389 "zone_append": false, 00:19:29.389 "compare": false, 00:19:29.389 "compare_and_write": false, 00:19:29.389 "abort": true, 00:19:29.389 "seek_hole": false, 00:19:29.389 "seek_data": false, 00:19:29.389 "copy": true, 00:19:29.389 "nvme_iov_md": false 00:19:29.389 }, 00:19:29.389 "memory_domains": [ 00:19:29.389 { 00:19:29.389 "dma_device_id": "system", 00:19:29.389 "dma_device_type": 1 00:19:29.389 }, 00:19:29.389 { 00:19:29.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.389 "dma_device_type": 2 00:19:29.389 } 00:19:29.389 ], 00:19:29.389 "driver_specific": {} 00:19:29.389 } 00:19:29.389 ] 00:19:29.389 09:23:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:29.389 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:29.389 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:29.389 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:29.648 [2024-07-15 09:23:38.434440] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:29.648 [2024-07-15 09:23:38.434486] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:29.648 
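The create/wait pattern traced above repeats for every slot: bdev_malloc_create 32 512 -b <name> allocates a 32 MiB malloc disk with 512-byte blocks (65536 blocks, matching the earlier JSON dumps), waitforbdev then drives bdev_wait_for_examine plus bdev_get_bdevs -b <name> -t 2000, and bdev_raid_create is issued with all four names even though BaseBdev1 does not exist yet, which is why the DEBUG lines above report it missing and the raid stays "configuring". A condensed, hedged sketch of that sequence, assuming the same socket and script path as this run:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Create three of the four base devices; BaseBdev1 is deliberately left out for now.
    for name in BaseBdev2 BaseBdev3 BaseBdev4; do
        $rpc bdev_malloc_create 32 512 -b "$name"
        $rpc bdev_wait_for_examine
        $rpc bdev_get_bdevs -b "$name" -t 2000 > /dev/null   # waitforbdev equivalent
    done
    # Assemble the concat array; it stays "configuring" until BaseBdev1 appears.
    $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid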
[2024-07-15 09:23:38.434505] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:29.649 [2024-07-15 09:23:38.435827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:29.649 [2024-07-15 09:23:38.435870] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.649 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:29.908 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:29.908 "name": "Existed_Raid", 00:19:29.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.908 "strip_size_kb": 64, 00:19:29.908 "state": "configuring", 00:19:29.908 "raid_level": "concat", 00:19:29.908 "superblock": false, 00:19:29.908 "num_base_bdevs": 4, 00:19:29.908 "num_base_bdevs_discovered": 3, 00:19:29.908 "num_base_bdevs_operational": 4, 00:19:29.908 "base_bdevs_list": [ 00:19:29.908 { 00:19:29.908 "name": "BaseBdev1", 00:19:29.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.908 "is_configured": false, 00:19:29.908 "data_offset": 0, 00:19:29.908 "data_size": 0 00:19:29.908 }, 00:19:29.908 { 00:19:29.908 "name": "BaseBdev2", 00:19:29.908 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:29.908 "is_configured": true, 00:19:29.908 "data_offset": 0, 00:19:29.908 "data_size": 65536 00:19:29.908 }, 00:19:29.908 { 00:19:29.908 "name": "BaseBdev3", 00:19:29.908 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:29.908 "is_configured": true, 00:19:29.908 "data_offset": 0, 00:19:29.908 "data_size": 65536 00:19:29.908 }, 00:19:29.908 { 00:19:29.908 "name": "BaseBdev4", 00:19:29.908 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:29.908 "is_configured": true, 00:19:29.908 "data_offset": 0, 00:19:29.908 "data_size": 65536 00:19:29.908 } 00:19:29.908 ] 00:19:29.908 }' 00:19:29.908 09:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:29.908 09:23:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:30.845 [2024-07-15 09:23:39.697772] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.845 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:31.104 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.104 "name": "Existed_Raid", 00:19:31.104 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.104 "strip_size_kb": 64, 00:19:31.104 "state": "configuring", 00:19:31.104 "raid_level": "concat", 00:19:31.104 "superblock": false, 00:19:31.104 "num_base_bdevs": 4, 00:19:31.104 "num_base_bdevs_discovered": 2, 00:19:31.104 "num_base_bdevs_operational": 4, 00:19:31.104 "base_bdevs_list": [ 00:19:31.104 { 00:19:31.104 "name": "BaseBdev1", 00:19:31.104 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.104 "is_configured": false, 00:19:31.104 "data_offset": 0, 00:19:31.104 "data_size": 0 00:19:31.104 }, 00:19:31.104 { 00:19:31.104 "name": null, 00:19:31.104 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:31.104 "is_configured": false, 00:19:31.104 "data_offset": 0, 00:19:31.104 "data_size": 65536 00:19:31.104 }, 00:19:31.104 { 00:19:31.104 "name": "BaseBdev3", 00:19:31.104 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:31.104 "is_configured": true, 00:19:31.104 "data_offset": 0, 00:19:31.104 "data_size": 65536 00:19:31.104 }, 00:19:31.104 { 00:19:31.104 "name": "BaseBdev4", 00:19:31.104 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:31.104 "is_configured": true, 00:19:31.104 "data_offset": 0, 00:19:31.104 "data_size": 65536 00:19:31.104 } 00:19:31.104 ] 00:19:31.104 }' 00:19:31.104 09:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:31.104 09:23:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:31.671 09:23:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.671 09:23:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:31.930 09:23:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:31.930 09:23:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:32.191 [2024-07-15 09:23:40.885491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:32.191 BaseBdev1 00:19:32.191 09:23:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:32.191 09:23:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:32.191 09:23:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:32.191 09:23:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:32.191 09:23:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:32.191 09:23:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:32.191 09:23:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:32.450 [ 00:19:32.450 { 00:19:32.450 "name": "BaseBdev1", 00:19:32.450 "aliases": [ 00:19:32.450 "61394e58-744f-4dff-84a9-2b22b861931b" 00:19:32.450 ], 00:19:32.450 "product_name": "Malloc disk", 00:19:32.450 "block_size": 512, 00:19:32.450 "num_blocks": 65536, 00:19:32.450 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:32.450 "assigned_rate_limits": { 00:19:32.450 "rw_ios_per_sec": 0, 00:19:32.450 "rw_mbytes_per_sec": 0, 00:19:32.450 "r_mbytes_per_sec": 0, 00:19:32.450 "w_mbytes_per_sec": 0 00:19:32.450 }, 00:19:32.450 "claimed": true, 00:19:32.450 "claim_type": "exclusive_write", 00:19:32.450 "zoned": false, 00:19:32.450 "supported_io_types": { 00:19:32.450 "read": true, 00:19:32.450 "write": true, 00:19:32.450 "unmap": true, 00:19:32.450 "flush": true, 00:19:32.450 "reset": true, 00:19:32.450 "nvme_admin": false, 00:19:32.450 "nvme_io": false, 00:19:32.450 "nvme_io_md": false, 00:19:32.450 "write_zeroes": true, 00:19:32.450 "zcopy": true, 00:19:32.450 "get_zone_info": false, 00:19:32.450 "zone_management": false, 00:19:32.450 "zone_append": false, 00:19:32.450 "compare": false, 00:19:32.450 "compare_and_write": false, 00:19:32.450 "abort": true, 00:19:32.450 "seek_hole": false, 00:19:32.450 "seek_data": false, 00:19:32.450 "copy": true, 00:19:32.450 "nvme_iov_md": false 00:19:32.450 }, 00:19:32.450 "memory_domains": [ 00:19:32.450 { 00:19:32.450 "dma_device_id": "system", 00:19:32.450 "dma_device_type": 1 00:19:32.450 }, 00:19:32.450 { 00:19:32.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:32.450 "dma_device_type": 2 00:19:32.450 } 00:19:32.450 ], 00:19:32.450 "driver_specific": {} 00:19:32.450 } 00:19:32.450 ] 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:32.450 09:23:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.450 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:32.709 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.709 "name": "Existed_Raid", 00:19:32.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.709 "strip_size_kb": 64, 00:19:32.709 "state": "configuring", 00:19:32.709 "raid_level": "concat", 00:19:32.709 "superblock": false, 00:19:32.709 "num_base_bdevs": 4, 00:19:32.709 "num_base_bdevs_discovered": 3, 00:19:32.709 "num_base_bdevs_operational": 4, 00:19:32.709 "base_bdevs_list": [ 00:19:32.709 { 00:19:32.709 "name": "BaseBdev1", 00:19:32.709 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:32.709 "is_configured": true, 00:19:32.709 "data_offset": 0, 00:19:32.709 "data_size": 65536 00:19:32.709 }, 00:19:32.709 { 00:19:32.709 "name": null, 00:19:32.709 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:32.709 "is_configured": false, 00:19:32.709 "data_offset": 0, 00:19:32.709 "data_size": 65536 00:19:32.709 }, 00:19:32.709 { 00:19:32.709 "name": "BaseBdev3", 00:19:32.709 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:32.709 "is_configured": true, 00:19:32.709 "data_offset": 0, 00:19:32.709 "data_size": 65536 00:19:32.709 }, 00:19:32.709 { 00:19:32.709 "name": "BaseBdev4", 00:19:32.709 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:32.709 "is_configured": true, 00:19:32.709 "data_offset": 0, 00:19:32.709 "data_size": 65536 00:19:32.709 } 00:19:32.709 ] 00:19:32.709 }' 00:19:32.709 09:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.709 09:23:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.277 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.277 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:33.536 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 
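Taken together, the checks just above exercise the degrade/repair cycle: bdev_raid_remove_base_bdev leaves the array "configuring" with the vacated slot reported as name null and is_configured false, and creating a replacement (or re-attaching one with bdev_raid_add_base_bdev, as the trace below does at bdev_raid.sh@321 and @329) flips the slot back to configured. A small hedged sketch of the slot inspection used throughout; the slot index and bdev name are illustrative parameters, not values taken from the script:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    slot=2   # hypothetical choice: index 2 of base_bdevs_list corresponds to BaseBdev3 in this run
    # Report whether the chosen slot currently holds a configured base bdev.
    $rpc bdev_raid_get_bdevs all | jq ".[0].base_bdevs_list[$slot].is_configured"
    # Re-attach an existing bdev to a vacated slot.
    $rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev3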
00:19:33.536 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:34.103 [2024-07-15 09:23:42.930919] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.103 09:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:34.361 09:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.361 "name": "Existed_Raid", 00:19:34.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:34.361 "strip_size_kb": 64, 00:19:34.361 "state": "configuring", 00:19:34.361 "raid_level": "concat", 00:19:34.361 "superblock": false, 00:19:34.361 "num_base_bdevs": 4, 00:19:34.361 "num_base_bdevs_discovered": 2, 00:19:34.361 "num_base_bdevs_operational": 4, 00:19:34.361 "base_bdevs_list": [ 00:19:34.361 { 00:19:34.361 "name": "BaseBdev1", 00:19:34.361 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:34.361 "is_configured": true, 00:19:34.361 "data_offset": 0, 00:19:34.361 "data_size": 65536 00:19:34.361 }, 00:19:34.361 { 00:19:34.361 "name": null, 00:19:34.361 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:34.361 "is_configured": false, 00:19:34.361 "data_offset": 0, 00:19:34.362 "data_size": 65536 00:19:34.362 }, 00:19:34.362 { 00:19:34.362 "name": null, 00:19:34.362 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:34.362 "is_configured": false, 00:19:34.362 "data_offset": 0, 00:19:34.362 "data_size": 65536 00:19:34.362 }, 00:19:34.362 { 00:19:34.362 "name": "BaseBdev4", 00:19:34.362 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:34.362 "is_configured": true, 00:19:34.362 "data_offset": 0, 00:19:34.362 "data_size": 65536 00:19:34.362 } 00:19:34.362 ] 00:19:34.362 }' 00:19:34.362 09:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.362 09:23:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:34.927 09:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.927 09:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:35.184 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:35.184 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:35.466 [2024-07-15 09:23:44.282501] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.466 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:35.724 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.724 "name": "Existed_Raid", 00:19:35.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.724 "strip_size_kb": 64, 00:19:35.724 "state": "configuring", 00:19:35.724 "raid_level": "concat", 00:19:35.724 "superblock": false, 00:19:35.724 "num_base_bdevs": 4, 00:19:35.724 "num_base_bdevs_discovered": 3, 00:19:35.725 "num_base_bdevs_operational": 4, 00:19:35.725 "base_bdevs_list": [ 00:19:35.725 { 00:19:35.725 "name": "BaseBdev1", 00:19:35.725 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:35.725 "is_configured": true, 00:19:35.725 "data_offset": 0, 00:19:35.725 "data_size": 65536 00:19:35.725 }, 00:19:35.725 { 00:19:35.725 "name": null, 00:19:35.725 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:35.725 "is_configured": false, 00:19:35.725 "data_offset": 0, 00:19:35.725 "data_size": 65536 00:19:35.725 }, 00:19:35.725 { 00:19:35.725 "name": "BaseBdev3", 00:19:35.725 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:35.725 "is_configured": true, 00:19:35.725 "data_offset": 0, 00:19:35.725 "data_size": 65536 00:19:35.725 }, 00:19:35.725 { 00:19:35.725 "name": "BaseBdev4", 00:19:35.725 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:35.725 "is_configured": true, 00:19:35.725 "data_offset": 0, 
00:19:35.725 "data_size": 65536 00:19:35.725 } 00:19:35.725 ] 00:19:35.725 }' 00:19:35.725 09:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.725 09:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.658 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:36.658 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.916 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:36.916 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:37.176 [2024-07-15 09:23:45.886816] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:37.176 09:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.435 09:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:37.435 "name": "Existed_Raid", 00:19:37.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:37.435 "strip_size_kb": 64, 00:19:37.435 "state": "configuring", 00:19:37.435 "raid_level": "concat", 00:19:37.435 "superblock": false, 00:19:37.435 "num_base_bdevs": 4, 00:19:37.435 "num_base_bdevs_discovered": 2, 00:19:37.435 "num_base_bdevs_operational": 4, 00:19:37.435 "base_bdevs_list": [ 00:19:37.435 { 00:19:37.435 "name": null, 00:19:37.435 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:37.435 "is_configured": false, 00:19:37.435 "data_offset": 0, 00:19:37.435 "data_size": 65536 00:19:37.435 }, 00:19:37.435 { 00:19:37.435 "name": null, 00:19:37.435 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:37.435 "is_configured": false, 00:19:37.435 "data_offset": 0, 00:19:37.435 "data_size": 65536 00:19:37.435 }, 00:19:37.435 { 00:19:37.435 "name": "BaseBdev3", 00:19:37.435 "uuid": 
"e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:37.435 "is_configured": true, 00:19:37.435 "data_offset": 0, 00:19:37.435 "data_size": 65536 00:19:37.435 }, 00:19:37.435 { 00:19:37.435 "name": "BaseBdev4", 00:19:37.435 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:37.435 "is_configured": true, 00:19:37.435 "data_offset": 0, 00:19:37.435 "data_size": 65536 00:19:37.435 } 00:19:37.435 ] 00:19:37.435 }' 00:19:37.435 09:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:37.435 09:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:38.371 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.371 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:38.371 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:38.371 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:38.630 [2024-07-15 09:23:47.510308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.630 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:38.889 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:38.889 "name": "Existed_Raid", 00:19:38.889 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:38.889 "strip_size_kb": 64, 00:19:38.889 "state": "configuring", 00:19:38.889 "raid_level": "concat", 00:19:38.889 "superblock": false, 00:19:38.889 "num_base_bdevs": 4, 00:19:38.889 "num_base_bdevs_discovered": 3, 00:19:38.889 "num_base_bdevs_operational": 4, 00:19:38.889 "base_bdevs_list": [ 00:19:38.889 { 00:19:38.889 "name": null, 00:19:38.889 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:38.889 "is_configured": false, 
00:19:38.889 "data_offset": 0, 00:19:38.889 "data_size": 65536 00:19:38.889 }, 00:19:38.889 { 00:19:38.889 "name": "BaseBdev2", 00:19:38.889 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:38.889 "is_configured": true, 00:19:38.889 "data_offset": 0, 00:19:38.890 "data_size": 65536 00:19:38.890 }, 00:19:38.890 { 00:19:38.890 "name": "BaseBdev3", 00:19:38.890 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:38.890 "is_configured": true, 00:19:38.890 "data_offset": 0, 00:19:38.890 "data_size": 65536 00:19:38.890 }, 00:19:38.890 { 00:19:38.890 "name": "BaseBdev4", 00:19:38.890 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:38.890 "is_configured": true, 00:19:38.890 "data_offset": 0, 00:19:38.890 "data_size": 65536 00:19:38.890 } 00:19:38.890 ] 00:19:38.890 }' 00:19:38.890 09:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:38.890 09:23:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.458 09:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.458 09:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:39.717 09:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:39.717 09:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.717 09:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:39.976 09:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 61394e58-744f-4dff-84a9-2b22b861931b 00:19:40.236 [2024-07-15 09:23:49.089791] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:40.236 [2024-07-15 09:23:49.089830] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13e5040 00:19:40.236 [2024-07-15 09:23:49.089839] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:40.236 [2024-07-15 09:23:49.090049] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e0a70 00:19:40.236 [2024-07-15 09:23:49.090168] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13e5040 00:19:40.236 [2024-07-15 09:23:49.090178] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13e5040 00:19:40.236 [2024-07-15 09:23:49.090340] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:40.236 NewBaseBdev 00:19:40.236 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:40.236 09:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:40.236 09:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:40.236 09:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:40.236 09:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:40.236 09:23:49 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:40.236 09:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:40.540 09:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:40.817 [ 00:19:40.817 { 00:19:40.817 "name": "NewBaseBdev", 00:19:40.817 "aliases": [ 00:19:40.817 "61394e58-744f-4dff-84a9-2b22b861931b" 00:19:40.817 ], 00:19:40.817 "product_name": "Malloc disk", 00:19:40.817 "block_size": 512, 00:19:40.817 "num_blocks": 65536, 00:19:40.817 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:40.817 "assigned_rate_limits": { 00:19:40.817 "rw_ios_per_sec": 0, 00:19:40.817 "rw_mbytes_per_sec": 0, 00:19:40.817 "r_mbytes_per_sec": 0, 00:19:40.817 "w_mbytes_per_sec": 0 00:19:40.817 }, 00:19:40.817 "claimed": true, 00:19:40.817 "claim_type": "exclusive_write", 00:19:40.817 "zoned": false, 00:19:40.817 "supported_io_types": { 00:19:40.817 "read": true, 00:19:40.817 "write": true, 00:19:40.817 "unmap": true, 00:19:40.817 "flush": true, 00:19:40.817 "reset": true, 00:19:40.817 "nvme_admin": false, 00:19:40.817 "nvme_io": false, 00:19:40.817 "nvme_io_md": false, 00:19:40.817 "write_zeroes": true, 00:19:40.817 "zcopy": true, 00:19:40.817 "get_zone_info": false, 00:19:40.817 "zone_management": false, 00:19:40.817 "zone_append": false, 00:19:40.817 "compare": false, 00:19:40.817 "compare_and_write": false, 00:19:40.817 "abort": true, 00:19:40.817 "seek_hole": false, 00:19:40.817 "seek_data": false, 00:19:40.817 "copy": true, 00:19:40.817 "nvme_iov_md": false 00:19:40.817 }, 00:19:40.817 "memory_domains": [ 00:19:40.817 { 00:19:40.817 "dma_device_id": "system", 00:19:40.817 "dma_device_type": 1 00:19:40.817 }, 00:19:40.817 { 00:19:40.817 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.817 "dma_device_type": 2 00:19:40.817 } 00:19:40.817 ], 00:19:40.817 "driver_specific": {} 00:19:40.817 } 00:19:40.817 ] 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.817 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:41.077 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.077 "name": "Existed_Raid", 00:19:41.077 "uuid": "d41bd591-ec22-4ea5-bd13-fd7d054b4a02", 00:19:41.077 "strip_size_kb": 64, 00:19:41.077 "state": "online", 00:19:41.077 "raid_level": "concat", 00:19:41.077 "superblock": false, 00:19:41.077 "num_base_bdevs": 4, 00:19:41.077 "num_base_bdevs_discovered": 4, 00:19:41.077 "num_base_bdevs_operational": 4, 00:19:41.077 "base_bdevs_list": [ 00:19:41.077 { 00:19:41.077 "name": "NewBaseBdev", 00:19:41.077 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:41.077 "is_configured": true, 00:19:41.077 "data_offset": 0, 00:19:41.077 "data_size": 65536 00:19:41.077 }, 00:19:41.077 { 00:19:41.077 "name": "BaseBdev2", 00:19:41.077 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:41.077 "is_configured": true, 00:19:41.077 "data_offset": 0, 00:19:41.077 "data_size": 65536 00:19:41.077 }, 00:19:41.077 { 00:19:41.077 "name": "BaseBdev3", 00:19:41.077 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:41.077 "is_configured": true, 00:19:41.077 "data_offset": 0, 00:19:41.077 "data_size": 65536 00:19:41.077 }, 00:19:41.077 { 00:19:41.077 "name": "BaseBdev4", 00:19:41.077 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:41.077 "is_configured": true, 00:19:41.077 "data_offset": 0, 00:19:41.077 "data_size": 65536 00:19:41.077 } 00:19:41.077 ] 00:19:41.077 }' 00:19:41.078 09:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.078 09:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:41.646 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:41.646 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:41.646 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:41.646 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:41.646 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:41.646 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:41.646 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:41.646 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:41.905 [2024-07-15 09:23:50.654269] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:41.905 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:41.905 "name": "Existed_Raid", 00:19:41.905 "aliases": [ 00:19:41.905 "d41bd591-ec22-4ea5-bd13-fd7d054b4a02" 00:19:41.905 ], 00:19:41.905 "product_name": "Raid Volume", 00:19:41.905 "block_size": 512, 00:19:41.905 "num_blocks": 262144, 00:19:41.905 "uuid": "d41bd591-ec22-4ea5-bd13-fd7d054b4a02", 00:19:41.905 "assigned_rate_limits": { 00:19:41.905 "rw_ios_per_sec": 0, 00:19:41.905 "rw_mbytes_per_sec": 0, 00:19:41.905 "r_mbytes_per_sec": 0, 00:19:41.905 "w_mbytes_per_sec": 0 00:19:41.905 }, 00:19:41.905 "claimed": false, 
00:19:41.905 "zoned": false, 00:19:41.905 "supported_io_types": { 00:19:41.905 "read": true, 00:19:41.905 "write": true, 00:19:41.905 "unmap": true, 00:19:41.905 "flush": true, 00:19:41.905 "reset": true, 00:19:41.905 "nvme_admin": false, 00:19:41.905 "nvme_io": false, 00:19:41.905 "nvme_io_md": false, 00:19:41.905 "write_zeroes": true, 00:19:41.905 "zcopy": false, 00:19:41.906 "get_zone_info": false, 00:19:41.906 "zone_management": false, 00:19:41.906 "zone_append": false, 00:19:41.906 "compare": false, 00:19:41.906 "compare_and_write": false, 00:19:41.906 "abort": false, 00:19:41.906 "seek_hole": false, 00:19:41.906 "seek_data": false, 00:19:41.906 "copy": false, 00:19:41.906 "nvme_iov_md": false 00:19:41.906 }, 00:19:41.906 "memory_domains": [ 00:19:41.906 { 00:19:41.906 "dma_device_id": "system", 00:19:41.906 "dma_device_type": 1 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.906 "dma_device_type": 2 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "dma_device_id": "system", 00:19:41.906 "dma_device_type": 1 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.906 "dma_device_type": 2 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "dma_device_id": "system", 00:19:41.906 "dma_device_type": 1 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.906 "dma_device_type": 2 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "dma_device_id": "system", 00:19:41.906 "dma_device_type": 1 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.906 "dma_device_type": 2 00:19:41.906 } 00:19:41.906 ], 00:19:41.906 "driver_specific": { 00:19:41.906 "raid": { 00:19:41.906 "uuid": "d41bd591-ec22-4ea5-bd13-fd7d054b4a02", 00:19:41.906 "strip_size_kb": 64, 00:19:41.906 "state": "online", 00:19:41.906 "raid_level": "concat", 00:19:41.906 "superblock": false, 00:19:41.906 "num_base_bdevs": 4, 00:19:41.906 "num_base_bdevs_discovered": 4, 00:19:41.906 "num_base_bdevs_operational": 4, 00:19:41.906 "base_bdevs_list": [ 00:19:41.906 { 00:19:41.906 "name": "NewBaseBdev", 00:19:41.906 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:41.906 "is_configured": true, 00:19:41.906 "data_offset": 0, 00:19:41.906 "data_size": 65536 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "name": "BaseBdev2", 00:19:41.906 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:41.906 "is_configured": true, 00:19:41.906 "data_offset": 0, 00:19:41.906 "data_size": 65536 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "name": "BaseBdev3", 00:19:41.906 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:41.906 "is_configured": true, 00:19:41.906 "data_offset": 0, 00:19:41.906 "data_size": 65536 00:19:41.906 }, 00:19:41.906 { 00:19:41.906 "name": "BaseBdev4", 00:19:41.906 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:41.906 "is_configured": true, 00:19:41.906 "data_offset": 0, 00:19:41.906 "data_size": 65536 00:19:41.906 } 00:19:41.906 ] 00:19:41.906 } 00:19:41.906 } 00:19:41.906 }' 00:19:41.906 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:41.906 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:41.906 BaseBdev2 00:19:41.906 BaseBdev3 00:19:41.906 BaseBdev4' 00:19:41.906 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:41.906 09:23:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:41.906 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.166 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.166 "name": "NewBaseBdev", 00:19:42.166 "aliases": [ 00:19:42.166 "61394e58-744f-4dff-84a9-2b22b861931b" 00:19:42.166 ], 00:19:42.166 "product_name": "Malloc disk", 00:19:42.166 "block_size": 512, 00:19:42.166 "num_blocks": 65536, 00:19:42.166 "uuid": "61394e58-744f-4dff-84a9-2b22b861931b", 00:19:42.166 "assigned_rate_limits": { 00:19:42.166 "rw_ios_per_sec": 0, 00:19:42.166 "rw_mbytes_per_sec": 0, 00:19:42.166 "r_mbytes_per_sec": 0, 00:19:42.166 "w_mbytes_per_sec": 0 00:19:42.166 }, 00:19:42.166 "claimed": true, 00:19:42.166 "claim_type": "exclusive_write", 00:19:42.166 "zoned": false, 00:19:42.166 "supported_io_types": { 00:19:42.166 "read": true, 00:19:42.166 "write": true, 00:19:42.166 "unmap": true, 00:19:42.166 "flush": true, 00:19:42.166 "reset": true, 00:19:42.166 "nvme_admin": false, 00:19:42.166 "nvme_io": false, 00:19:42.166 "nvme_io_md": false, 00:19:42.166 "write_zeroes": true, 00:19:42.166 "zcopy": true, 00:19:42.166 "get_zone_info": false, 00:19:42.166 "zone_management": false, 00:19:42.166 "zone_append": false, 00:19:42.166 "compare": false, 00:19:42.166 "compare_and_write": false, 00:19:42.166 "abort": true, 00:19:42.166 "seek_hole": false, 00:19:42.166 "seek_data": false, 00:19:42.166 "copy": true, 00:19:42.166 "nvme_iov_md": false 00:19:42.166 }, 00:19:42.166 "memory_domains": [ 00:19:42.166 { 00:19:42.166 "dma_device_id": "system", 00:19:42.166 "dma_device_type": 1 00:19:42.166 }, 00:19:42.166 { 00:19:42.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.166 "dma_device_type": 2 00:19:42.166 } 00:19:42.166 ], 00:19:42.166 "driver_specific": {} 00:19:42.166 }' 00:19:42.166 09:23:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.166 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.166 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:42.166 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.166 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev2 00:19:42.425 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.683 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.683 "name": "BaseBdev2", 00:19:42.683 "aliases": [ 00:19:42.683 "15624a6c-8875-4b6b-a2be-9eac8854a168" 00:19:42.683 ], 00:19:42.683 "product_name": "Malloc disk", 00:19:42.683 "block_size": 512, 00:19:42.683 "num_blocks": 65536, 00:19:42.683 "uuid": "15624a6c-8875-4b6b-a2be-9eac8854a168", 00:19:42.683 "assigned_rate_limits": { 00:19:42.683 "rw_ios_per_sec": 0, 00:19:42.683 "rw_mbytes_per_sec": 0, 00:19:42.683 "r_mbytes_per_sec": 0, 00:19:42.683 "w_mbytes_per_sec": 0 00:19:42.683 }, 00:19:42.683 "claimed": true, 00:19:42.683 "claim_type": "exclusive_write", 00:19:42.683 "zoned": false, 00:19:42.683 "supported_io_types": { 00:19:42.683 "read": true, 00:19:42.684 "write": true, 00:19:42.684 "unmap": true, 00:19:42.684 "flush": true, 00:19:42.684 "reset": true, 00:19:42.684 "nvme_admin": false, 00:19:42.684 "nvme_io": false, 00:19:42.684 "nvme_io_md": false, 00:19:42.684 "write_zeroes": true, 00:19:42.684 "zcopy": true, 00:19:42.684 "get_zone_info": false, 00:19:42.684 "zone_management": false, 00:19:42.684 "zone_append": false, 00:19:42.684 "compare": false, 00:19:42.684 "compare_and_write": false, 00:19:42.684 "abort": true, 00:19:42.684 "seek_hole": false, 00:19:42.684 "seek_data": false, 00:19:42.684 "copy": true, 00:19:42.684 "nvme_iov_md": false 00:19:42.684 }, 00:19:42.684 "memory_domains": [ 00:19:42.684 { 00:19:42.684 "dma_device_id": "system", 00:19:42.684 "dma_device_type": 1 00:19:42.684 }, 00:19:42.684 { 00:19:42.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.684 "dma_device_type": 2 00:19:42.684 } 00:19:42.684 ], 00:19:42.684 "driver_specific": {} 00:19:42.684 }' 00:19:42.684 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.684 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.684 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:42.684 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.684 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:42.942 09:23:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.201 09:23:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.201 "name": "BaseBdev3", 00:19:43.201 "aliases": [ 00:19:43.201 "e8d7de2d-2a68-45fb-b043-d046ea96addc" 00:19:43.201 ], 00:19:43.201 "product_name": "Malloc disk", 00:19:43.201 "block_size": 512, 00:19:43.201 "num_blocks": 65536, 00:19:43.201 "uuid": "e8d7de2d-2a68-45fb-b043-d046ea96addc", 00:19:43.201 "assigned_rate_limits": { 00:19:43.201 "rw_ios_per_sec": 0, 00:19:43.201 "rw_mbytes_per_sec": 0, 00:19:43.201 "r_mbytes_per_sec": 0, 00:19:43.201 "w_mbytes_per_sec": 0 00:19:43.201 }, 00:19:43.201 "claimed": true, 00:19:43.201 "claim_type": "exclusive_write", 00:19:43.201 "zoned": false, 00:19:43.201 "supported_io_types": { 00:19:43.201 "read": true, 00:19:43.201 "write": true, 00:19:43.201 "unmap": true, 00:19:43.201 "flush": true, 00:19:43.201 "reset": true, 00:19:43.201 "nvme_admin": false, 00:19:43.201 "nvme_io": false, 00:19:43.201 "nvme_io_md": false, 00:19:43.201 "write_zeroes": true, 00:19:43.201 "zcopy": true, 00:19:43.201 "get_zone_info": false, 00:19:43.201 "zone_management": false, 00:19:43.201 "zone_append": false, 00:19:43.201 "compare": false, 00:19:43.201 "compare_and_write": false, 00:19:43.201 "abort": true, 00:19:43.201 "seek_hole": false, 00:19:43.201 "seek_data": false, 00:19:43.201 "copy": true, 00:19:43.201 "nvme_iov_md": false 00:19:43.201 }, 00:19:43.201 "memory_domains": [ 00:19:43.201 { 00:19:43.201 "dma_device_id": "system", 00:19:43.201 "dma_device_type": 1 00:19:43.201 }, 00:19:43.201 { 00:19:43.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.201 "dma_device_type": 2 00:19:43.201 } 00:19:43.201 ], 00:19:43.201 "driver_specific": {} 00:19:43.201 }' 00:19:43.201 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.201 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.460 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.460 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.460 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.460 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.460 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.460 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.460 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.460 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.460 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.719 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.719 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.719 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:43.719 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.978 "name": "BaseBdev4", 00:19:43.978 "aliases": [ 00:19:43.978 "c7a8c587-696a-44fc-a5c5-897415363011" 00:19:43.978 ], 
00:19:43.978 "product_name": "Malloc disk", 00:19:43.978 "block_size": 512, 00:19:43.978 "num_blocks": 65536, 00:19:43.978 "uuid": "c7a8c587-696a-44fc-a5c5-897415363011", 00:19:43.978 "assigned_rate_limits": { 00:19:43.978 "rw_ios_per_sec": 0, 00:19:43.978 "rw_mbytes_per_sec": 0, 00:19:43.978 "r_mbytes_per_sec": 0, 00:19:43.978 "w_mbytes_per_sec": 0 00:19:43.978 }, 00:19:43.978 "claimed": true, 00:19:43.978 "claim_type": "exclusive_write", 00:19:43.978 "zoned": false, 00:19:43.978 "supported_io_types": { 00:19:43.978 "read": true, 00:19:43.978 "write": true, 00:19:43.978 "unmap": true, 00:19:43.978 "flush": true, 00:19:43.978 "reset": true, 00:19:43.978 "nvme_admin": false, 00:19:43.978 "nvme_io": false, 00:19:43.978 "nvme_io_md": false, 00:19:43.978 "write_zeroes": true, 00:19:43.978 "zcopy": true, 00:19:43.978 "get_zone_info": false, 00:19:43.978 "zone_management": false, 00:19:43.978 "zone_append": false, 00:19:43.978 "compare": false, 00:19:43.978 "compare_and_write": false, 00:19:43.978 "abort": true, 00:19:43.978 "seek_hole": false, 00:19:43.978 "seek_data": false, 00:19:43.978 "copy": true, 00:19:43.978 "nvme_iov_md": false 00:19:43.978 }, 00:19:43.978 "memory_domains": [ 00:19:43.978 { 00:19:43.978 "dma_device_id": "system", 00:19:43.978 "dma_device_type": 1 00:19:43.978 }, 00:19:43.978 { 00:19:43.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.978 "dma_device_type": 2 00:19:43.978 } 00:19:43.978 ], 00:19:43.978 "driver_specific": {} 00:19:43.978 }' 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.978 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.236 09:23:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.236 09:23:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.236 09:23:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:44.495 [2024-07-15 09:23:53.236847] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:44.495 [2024-07-15 09:23:53.236877] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:44.495 [2024-07-15 09:23:53.236942] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:44.495 [2024-07-15 09:23:53.237003] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:44.495 [2024-07-15 09:23:53.237017] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x13e5040 name Existed_Raid, state offline 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 159629 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 159629 ']' 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 159629 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 159629 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 159629' 00:19:44.495 killing process with pid 159629 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 159629 00:19:44.495 [2024-07-15 09:23:53.304853] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:44.495 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 159629 00:19:44.495 [2024-07-15 09:23:53.347336] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:44.755 00:19:44.755 real 0m32.738s 00:19:44.755 user 1m0.129s 00:19:44.755 sys 0m5.777s 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:44.755 ************************************ 00:19:44.755 END TEST raid_state_function_test 00:19:44.755 ************************************ 00:19:44.755 09:23:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:44.755 09:23:53 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:19:44.755 09:23:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:44.755 09:23:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:44.755 09:23:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:44.755 ************************************ 00:19:44.755 START TEST raid_state_function_test_sb 00:19:44.755 ************************************ 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
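The raid_state_function_test run that finishes above exercises a small set of rpc.py calls against the raid socket. As a minimal sketch only (socket path, bdev sizes, names and the jq filters are taken from the log above; the rpc.py path is assumed to be run from an SPDK checkout, and a configured Existed_Raid with one unconfigured base bdev slot is assumed to already exist):

```bash
#!/usr/bin/env bash
# Sketch of the RPC sequence seen in raid_state_function_test above.
# Assumes an SPDK app is listening on /var/tmp/spdk-raid.sock and that an
# Existed_Raid concat volume exists with base bdev slot 0 not configured.
RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Confirm slot 0 of the raid bdev is not configured
$RPC bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[0].is_configured'

# Re-create the missing base bdev as a 32 MiB, 512-byte-block malloc disk,
# reusing the UUID recorded for that slot so the raid bdev can claim it
uuid=$($RPC bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')
$RPC bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"
$RPC bdev_wait_for_examine

# The raid bdev should now report state "online" with 4/4 base bdevs
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'

# Tear down, as the test does before the next test case starts
$RPC bdev_raid_delete Existed_Raid
```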
00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=164517 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 164517' 00:19:44.755 Process raid pid: 164517 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 164517 /var/tmp/spdk-raid.sock 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 164517 ']' 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local 
max_retries=100 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:44.755 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:44.755 09:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:45.014 [2024-07-15 09:23:53.717235] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:19:45.014 [2024-07-15 09:23:53.717302] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:45.014 [2024-07-15 09:23:53.847340] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:45.014 [2024-07-15 09:23:53.953688] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:45.272 [2024-07-15 09:23:54.020354] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:45.272 [2024-07-15 09:23:54.020391] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:45.839 09:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:45.839 09:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:45.839 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:46.098 [2024-07-15 09:23:54.863920] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:46.098 [2024-07-15 09:23:54.863967] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:46.098 [2024-07-15 09:23:54.863978] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:46.098 [2024-07-15 09:23:54.863990] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:46.098 [2024-07-15 09:23:54.863999] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:46.098 [2024-07-15 09:23:54.864010] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:46.098 [2024-07-15 09:23:54.864019] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:46.098 [2024-07-15 09:23:54.864030] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:46.098 
09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.098 09:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:46.356 09:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.356 "name": "Existed_Raid", 00:19:46.356 "uuid": "26622531-b7ce-49c4-9ae6-aeccc6a590ce", 00:19:46.356 "strip_size_kb": 64, 00:19:46.356 "state": "configuring", 00:19:46.356 "raid_level": "concat", 00:19:46.356 "superblock": true, 00:19:46.356 "num_base_bdevs": 4, 00:19:46.356 "num_base_bdevs_discovered": 0, 00:19:46.356 "num_base_bdevs_operational": 4, 00:19:46.356 "base_bdevs_list": [ 00:19:46.356 { 00:19:46.356 "name": "BaseBdev1", 00:19:46.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.356 "is_configured": false, 00:19:46.356 "data_offset": 0, 00:19:46.356 "data_size": 0 00:19:46.356 }, 00:19:46.356 { 00:19:46.357 "name": "BaseBdev2", 00:19:46.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.357 "is_configured": false, 00:19:46.357 "data_offset": 0, 00:19:46.357 "data_size": 0 00:19:46.357 }, 00:19:46.357 { 00:19:46.357 "name": "BaseBdev3", 00:19:46.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.357 "is_configured": false, 00:19:46.357 "data_offset": 0, 00:19:46.357 "data_size": 0 00:19:46.357 }, 00:19:46.357 { 00:19:46.357 "name": "BaseBdev4", 00:19:46.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.357 "is_configured": false, 00:19:46.357 "data_offset": 0, 00:19:46.357 "data_size": 0 00:19:46.357 } 00:19:46.357 ] 00:19:46.357 }' 00:19:46.357 09:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.357 09:23:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:46.924 09:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:47.183 [2024-07-15 09:23:55.950629] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:47.183 [2024-07-15 09:23:55.950661] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc7aa0 name Existed_Raid, state configuring 00:19:47.183 09:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:47.442 [2024-07-15 09:23:56.195306] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:47.442 [2024-07-15 09:23:56.195335] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:19:47.442 [2024-07-15 09:23:56.195344] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:47.442 [2024-07-15 09:23:56.195356] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:47.442 [2024-07-15 09:23:56.195365] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:47.442 [2024-07-15 09:23:56.195376] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:47.442 [2024-07-15 09:23:56.195385] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:47.442 [2024-07-15 09:23:56.195396] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:47.442 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:47.701 [2024-07-15 09:23:56.445778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:47.701 BaseBdev1 00:19:47.701 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:47.701 09:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:47.701 09:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:47.701 09:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:47.701 09:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:47.701 09:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:47.701 09:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:47.959 09:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:48.219 [ 00:19:48.219 { 00:19:48.219 "name": "BaseBdev1", 00:19:48.219 "aliases": [ 00:19:48.219 "36d4b5dd-1c23-42ca-904f-294a639d63d9" 00:19:48.219 ], 00:19:48.219 "product_name": "Malloc disk", 00:19:48.219 "block_size": 512, 00:19:48.219 "num_blocks": 65536, 00:19:48.219 "uuid": "36d4b5dd-1c23-42ca-904f-294a639d63d9", 00:19:48.219 "assigned_rate_limits": { 00:19:48.219 "rw_ios_per_sec": 0, 00:19:48.219 "rw_mbytes_per_sec": 0, 00:19:48.219 "r_mbytes_per_sec": 0, 00:19:48.219 "w_mbytes_per_sec": 0 00:19:48.219 }, 00:19:48.219 "claimed": true, 00:19:48.219 "claim_type": "exclusive_write", 00:19:48.219 "zoned": false, 00:19:48.219 "supported_io_types": { 00:19:48.219 "read": true, 00:19:48.219 "write": true, 00:19:48.219 "unmap": true, 00:19:48.219 "flush": true, 00:19:48.219 "reset": true, 00:19:48.219 "nvme_admin": false, 00:19:48.219 "nvme_io": false, 00:19:48.219 "nvme_io_md": false, 00:19:48.219 "write_zeroes": true, 00:19:48.219 "zcopy": true, 00:19:48.219 "get_zone_info": false, 00:19:48.219 "zone_management": false, 00:19:48.219 "zone_append": false, 00:19:48.219 "compare": false, 00:19:48.219 "compare_and_write": false, 00:19:48.219 "abort": true, 00:19:48.219 "seek_hole": false, 00:19:48.219 "seek_data": false, 00:19:48.219 "copy": true, 
00:19:48.219 "nvme_iov_md": false 00:19:48.219 }, 00:19:48.219 "memory_domains": [ 00:19:48.219 { 00:19:48.219 "dma_device_id": "system", 00:19:48.219 "dma_device_type": 1 00:19:48.219 }, 00:19:48.219 { 00:19:48.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.219 "dma_device_type": 2 00:19:48.219 } 00:19:48.219 ], 00:19:48.219 "driver_specific": {} 00:19:48.219 } 00:19:48.219 ] 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:48.219 09:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.478 09:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.478 "name": "Existed_Raid", 00:19:48.478 "uuid": "8c402b67-4ba9-4c3a-9b2f-dcb08542948b", 00:19:48.478 "strip_size_kb": 64, 00:19:48.478 "state": "configuring", 00:19:48.478 "raid_level": "concat", 00:19:48.478 "superblock": true, 00:19:48.478 "num_base_bdevs": 4, 00:19:48.478 "num_base_bdevs_discovered": 1, 00:19:48.478 "num_base_bdevs_operational": 4, 00:19:48.478 "base_bdevs_list": [ 00:19:48.478 { 00:19:48.478 "name": "BaseBdev1", 00:19:48.478 "uuid": "36d4b5dd-1c23-42ca-904f-294a639d63d9", 00:19:48.478 "is_configured": true, 00:19:48.478 "data_offset": 2048, 00:19:48.478 "data_size": 63488 00:19:48.478 }, 00:19:48.478 { 00:19:48.478 "name": "BaseBdev2", 00:19:48.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.478 "is_configured": false, 00:19:48.478 "data_offset": 0, 00:19:48.478 "data_size": 0 00:19:48.478 }, 00:19:48.478 { 00:19:48.478 "name": "BaseBdev3", 00:19:48.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.478 "is_configured": false, 00:19:48.478 "data_offset": 0, 00:19:48.478 "data_size": 0 00:19:48.478 }, 00:19:48.478 { 00:19:48.478 "name": "BaseBdev4", 00:19:48.479 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.479 "is_configured": false, 00:19:48.479 "data_offset": 0, 00:19:48.479 "data_size": 0 00:19:48.479 } 00:19:48.479 ] 00:19:48.479 }' 00:19:48.479 09:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:19:48.479 09:23:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:49.044 09:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:49.303 [2024-07-15 09:23:58.021953] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:49.303 [2024-07-15 09:23:58.021987] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc7310 name Existed_Raid, state configuring 00:19:49.303 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:49.561 [2024-07-15 09:23:58.266642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:49.561 [2024-07-15 09:23:58.268069] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:49.561 [2024-07-15 09:23:58.268101] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:49.561 [2024-07-15 09:23:58.268111] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:49.561 [2024-07-15 09:23:58.268123] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:49.561 [2024-07-15 09:23:58.268132] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:49.561 [2024-07-15 09:23:58.268148] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:49.561 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.820 09:23:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.820 "name": "Existed_Raid", 00:19:49.820 "uuid": "aaba37db-3ea2-40fc-87bd-af24a1415be6", 00:19:49.820 "strip_size_kb": 64, 00:19:49.820 "state": "configuring", 00:19:49.820 "raid_level": "concat", 00:19:49.820 "superblock": true, 00:19:49.820 "num_base_bdevs": 4, 00:19:49.820 "num_base_bdevs_discovered": 1, 00:19:49.820 "num_base_bdevs_operational": 4, 00:19:49.820 "base_bdevs_list": [ 00:19:49.820 { 00:19:49.820 "name": "BaseBdev1", 00:19:49.820 "uuid": "36d4b5dd-1c23-42ca-904f-294a639d63d9", 00:19:49.820 "is_configured": true, 00:19:49.820 "data_offset": 2048, 00:19:49.820 "data_size": 63488 00:19:49.820 }, 00:19:49.820 { 00:19:49.820 "name": "BaseBdev2", 00:19:49.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.820 "is_configured": false, 00:19:49.820 "data_offset": 0, 00:19:49.820 "data_size": 0 00:19:49.820 }, 00:19:49.820 { 00:19:49.820 "name": "BaseBdev3", 00:19:49.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.820 "is_configured": false, 00:19:49.820 "data_offset": 0, 00:19:49.820 "data_size": 0 00:19:49.820 }, 00:19:49.820 { 00:19:49.820 "name": "BaseBdev4", 00:19:49.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.820 "is_configured": false, 00:19:49.820 "data_offset": 0, 00:19:49.820 "data_size": 0 00:19:49.820 } 00:19:49.820 ] 00:19:49.820 }' 00:19:49.820 09:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:49.820 09:23:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:50.387 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:50.646 [2024-07-15 09:23:59.364980] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:50.646 BaseBdev2 00:19:50.646 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:50.646 09:23:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:50.646 09:23:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:50.646 09:23:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:50.646 09:23:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:50.646 09:23:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:50.646 09:23:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:50.905 09:23:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:50.905 [ 00:19:50.905 { 00:19:50.905 "name": "BaseBdev2", 00:19:50.905 "aliases": [ 00:19:50.905 "2ffcd6ae-5ecb-4263-9de0-73dcbeb5c400" 00:19:50.905 ], 00:19:50.905 "product_name": "Malloc disk", 00:19:50.905 "block_size": 512, 00:19:50.905 "num_blocks": 65536, 00:19:50.905 "uuid": "2ffcd6ae-5ecb-4263-9de0-73dcbeb5c400", 00:19:50.905 "assigned_rate_limits": { 00:19:50.905 "rw_ios_per_sec": 0, 00:19:50.905 "rw_mbytes_per_sec": 0, 
00:19:50.905 "r_mbytes_per_sec": 0, 00:19:50.905 "w_mbytes_per_sec": 0 00:19:50.905 }, 00:19:50.905 "claimed": true, 00:19:50.905 "claim_type": "exclusive_write", 00:19:50.905 "zoned": false, 00:19:50.905 "supported_io_types": { 00:19:50.905 "read": true, 00:19:50.905 "write": true, 00:19:50.905 "unmap": true, 00:19:50.905 "flush": true, 00:19:50.905 "reset": true, 00:19:50.905 "nvme_admin": false, 00:19:50.905 "nvme_io": false, 00:19:50.905 "nvme_io_md": false, 00:19:50.905 "write_zeroes": true, 00:19:50.905 "zcopy": true, 00:19:50.905 "get_zone_info": false, 00:19:50.905 "zone_management": false, 00:19:50.905 "zone_append": false, 00:19:50.905 "compare": false, 00:19:50.905 "compare_and_write": false, 00:19:50.905 "abort": true, 00:19:50.905 "seek_hole": false, 00:19:50.905 "seek_data": false, 00:19:50.905 "copy": true, 00:19:50.905 "nvme_iov_md": false 00:19:50.905 }, 00:19:50.905 "memory_domains": [ 00:19:50.905 { 00:19:50.905 "dma_device_id": "system", 00:19:50.905 "dma_device_type": 1 00:19:50.905 }, 00:19:50.905 { 00:19:50.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.905 "dma_device_type": 2 00:19:50.905 } 00:19:50.905 ], 00:19:50.905 "driver_specific": {} 00:19:50.905 } 00:19:50.905 ] 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.164 09:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:51.423 09:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.423 "name": "Existed_Raid", 00:19:51.423 "uuid": "aaba37db-3ea2-40fc-87bd-af24a1415be6", 00:19:51.423 "strip_size_kb": 64, 00:19:51.423 "state": "configuring", 00:19:51.423 "raid_level": "concat", 00:19:51.423 "superblock": true, 00:19:51.423 "num_base_bdevs": 4, 00:19:51.423 "num_base_bdevs_discovered": 2, 00:19:51.423 "num_base_bdevs_operational": 4, 
00:19:51.423 "base_bdevs_list": [ 00:19:51.423 { 00:19:51.423 "name": "BaseBdev1", 00:19:51.423 "uuid": "36d4b5dd-1c23-42ca-904f-294a639d63d9", 00:19:51.423 "is_configured": true, 00:19:51.423 "data_offset": 2048, 00:19:51.423 "data_size": 63488 00:19:51.423 }, 00:19:51.423 { 00:19:51.423 "name": "BaseBdev2", 00:19:51.423 "uuid": "2ffcd6ae-5ecb-4263-9de0-73dcbeb5c400", 00:19:51.423 "is_configured": true, 00:19:51.423 "data_offset": 2048, 00:19:51.423 "data_size": 63488 00:19:51.423 }, 00:19:51.423 { 00:19:51.423 "name": "BaseBdev3", 00:19:51.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.423 "is_configured": false, 00:19:51.423 "data_offset": 0, 00:19:51.423 "data_size": 0 00:19:51.423 }, 00:19:51.423 { 00:19:51.423 "name": "BaseBdev4", 00:19:51.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.423 "is_configured": false, 00:19:51.423 "data_offset": 0, 00:19:51.423 "data_size": 0 00:19:51.423 } 00:19:51.423 ] 00:19:51.423 }' 00:19:51.423 09:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.423 09:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:51.989 09:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:52.247 [2024-07-15 09:24:00.953147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:52.247 BaseBdev3 00:19:52.247 09:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:52.247 09:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:52.247 09:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:52.247 09:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:52.247 09:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:52.247 09:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:52.247 09:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:52.506 09:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:52.507 [ 00:19:52.507 { 00:19:52.507 "name": "BaseBdev3", 00:19:52.507 "aliases": [ 00:19:52.507 "84403b91-699c-4183-801a-0165933d7baf" 00:19:52.507 ], 00:19:52.507 "product_name": "Malloc disk", 00:19:52.507 "block_size": 512, 00:19:52.507 "num_blocks": 65536, 00:19:52.507 "uuid": "84403b91-699c-4183-801a-0165933d7baf", 00:19:52.507 "assigned_rate_limits": { 00:19:52.507 "rw_ios_per_sec": 0, 00:19:52.507 "rw_mbytes_per_sec": 0, 00:19:52.507 "r_mbytes_per_sec": 0, 00:19:52.507 "w_mbytes_per_sec": 0 00:19:52.507 }, 00:19:52.507 "claimed": true, 00:19:52.507 "claim_type": "exclusive_write", 00:19:52.507 "zoned": false, 00:19:52.507 "supported_io_types": { 00:19:52.507 "read": true, 00:19:52.507 "write": true, 00:19:52.507 "unmap": true, 00:19:52.507 "flush": true, 00:19:52.507 "reset": true, 00:19:52.507 "nvme_admin": false, 00:19:52.507 "nvme_io": false, 00:19:52.507 
"nvme_io_md": false, 00:19:52.507 "write_zeroes": true, 00:19:52.507 "zcopy": true, 00:19:52.507 "get_zone_info": false, 00:19:52.507 "zone_management": false, 00:19:52.507 "zone_append": false, 00:19:52.507 "compare": false, 00:19:52.507 "compare_and_write": false, 00:19:52.507 "abort": true, 00:19:52.507 "seek_hole": false, 00:19:52.507 "seek_data": false, 00:19:52.507 "copy": true, 00:19:52.507 "nvme_iov_md": false 00:19:52.507 }, 00:19:52.507 "memory_domains": [ 00:19:52.507 { 00:19:52.507 "dma_device_id": "system", 00:19:52.507 "dma_device_type": 1 00:19:52.507 }, 00:19:52.507 { 00:19:52.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.507 "dma_device_type": 2 00:19:52.507 } 00:19:52.507 ], 00:19:52.507 "driver_specific": {} 00:19:52.507 } 00:19:52.507 ] 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.507 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:52.765 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.765 "name": "Existed_Raid", 00:19:52.765 "uuid": "aaba37db-3ea2-40fc-87bd-af24a1415be6", 00:19:52.765 "strip_size_kb": 64, 00:19:52.765 "state": "configuring", 00:19:52.765 "raid_level": "concat", 00:19:52.765 "superblock": true, 00:19:52.765 "num_base_bdevs": 4, 00:19:52.765 "num_base_bdevs_discovered": 3, 00:19:52.765 "num_base_bdevs_operational": 4, 00:19:52.765 "base_bdevs_list": [ 00:19:52.765 { 00:19:52.765 "name": "BaseBdev1", 00:19:52.765 "uuid": "36d4b5dd-1c23-42ca-904f-294a639d63d9", 00:19:52.765 "is_configured": true, 00:19:52.765 "data_offset": 2048, 00:19:52.765 "data_size": 63488 00:19:52.765 }, 00:19:52.765 { 00:19:52.765 "name": "BaseBdev2", 00:19:52.765 "uuid": "2ffcd6ae-5ecb-4263-9de0-73dcbeb5c400", 00:19:52.765 "is_configured": true, 00:19:52.765 "data_offset": 2048, 00:19:52.765 
"data_size": 63488 00:19:52.765 }, 00:19:52.765 { 00:19:52.765 "name": "BaseBdev3", 00:19:52.765 "uuid": "84403b91-699c-4183-801a-0165933d7baf", 00:19:52.765 "is_configured": true, 00:19:52.765 "data_offset": 2048, 00:19:52.765 "data_size": 63488 00:19:52.765 }, 00:19:52.765 { 00:19:52.765 "name": "BaseBdev4", 00:19:52.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:52.765 "is_configured": false, 00:19:52.765 "data_offset": 0, 00:19:52.766 "data_size": 0 00:19:52.766 } 00:19:52.766 ] 00:19:52.766 }' 00:19:52.766 09:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.766 09:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:53.341 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:53.599 [2024-07-15 09:24:02.444468] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:53.599 [2024-07-15 09:24:02.444637] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcc8350 00:19:53.599 [2024-07-15 09:24:02.444652] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:53.599 [2024-07-15 09:24:02.444825] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcc8020 00:19:53.599 [2024-07-15 09:24:02.444953] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcc8350 00:19:53.599 [2024-07-15 09:24:02.444964] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcc8350 00:19:53.599 [2024-07-15 09:24:02.445055] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:53.599 BaseBdev4 00:19:53.599 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:53.599 09:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:53.599 09:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:53.599 09:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:53.599 09:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:53.599 09:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:53.599 09:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:53.858 09:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:54.116 [ 00:19:54.116 { 00:19:54.116 "name": "BaseBdev4", 00:19:54.116 "aliases": [ 00:19:54.116 "37d35526-b0df-49d1-89b3-97499cbc7a6f" 00:19:54.116 ], 00:19:54.116 "product_name": "Malloc disk", 00:19:54.116 "block_size": 512, 00:19:54.116 "num_blocks": 65536, 00:19:54.116 "uuid": "37d35526-b0df-49d1-89b3-97499cbc7a6f", 00:19:54.116 "assigned_rate_limits": { 00:19:54.116 "rw_ios_per_sec": 0, 00:19:54.116 "rw_mbytes_per_sec": 0, 00:19:54.116 "r_mbytes_per_sec": 0, 00:19:54.116 "w_mbytes_per_sec": 0 00:19:54.116 }, 00:19:54.116 "claimed": true, 00:19:54.116 "claim_type": 
"exclusive_write", 00:19:54.116 "zoned": false, 00:19:54.116 "supported_io_types": { 00:19:54.116 "read": true, 00:19:54.116 "write": true, 00:19:54.116 "unmap": true, 00:19:54.116 "flush": true, 00:19:54.116 "reset": true, 00:19:54.116 "nvme_admin": false, 00:19:54.116 "nvme_io": false, 00:19:54.116 "nvme_io_md": false, 00:19:54.116 "write_zeroes": true, 00:19:54.116 "zcopy": true, 00:19:54.116 "get_zone_info": false, 00:19:54.116 "zone_management": false, 00:19:54.116 "zone_append": false, 00:19:54.116 "compare": false, 00:19:54.116 "compare_and_write": false, 00:19:54.116 "abort": true, 00:19:54.116 "seek_hole": false, 00:19:54.116 "seek_data": false, 00:19:54.116 "copy": true, 00:19:54.116 "nvme_iov_md": false 00:19:54.116 }, 00:19:54.116 "memory_domains": [ 00:19:54.116 { 00:19:54.116 "dma_device_id": "system", 00:19:54.116 "dma_device_type": 1 00:19:54.116 }, 00:19:54.116 { 00:19:54.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.116 "dma_device_type": 2 00:19:54.116 } 00:19:54.116 ], 00:19:54.116 "driver_specific": {} 00:19:54.116 } 00:19:54.116 ] 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.116 09:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:54.409 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:54.409 "name": "Existed_Raid", 00:19:54.409 "uuid": "aaba37db-3ea2-40fc-87bd-af24a1415be6", 00:19:54.409 "strip_size_kb": 64, 00:19:54.409 "state": "online", 00:19:54.409 "raid_level": "concat", 00:19:54.409 "superblock": true, 00:19:54.409 "num_base_bdevs": 4, 00:19:54.409 "num_base_bdevs_discovered": 4, 00:19:54.409 "num_base_bdevs_operational": 4, 00:19:54.409 "base_bdevs_list": [ 00:19:54.409 { 00:19:54.409 "name": "BaseBdev1", 00:19:54.409 "uuid": "36d4b5dd-1c23-42ca-904f-294a639d63d9", 00:19:54.409 
"is_configured": true, 00:19:54.409 "data_offset": 2048, 00:19:54.409 "data_size": 63488 00:19:54.409 }, 00:19:54.409 { 00:19:54.409 "name": "BaseBdev2", 00:19:54.409 "uuid": "2ffcd6ae-5ecb-4263-9de0-73dcbeb5c400", 00:19:54.409 "is_configured": true, 00:19:54.409 "data_offset": 2048, 00:19:54.409 "data_size": 63488 00:19:54.409 }, 00:19:54.409 { 00:19:54.409 "name": "BaseBdev3", 00:19:54.409 "uuid": "84403b91-699c-4183-801a-0165933d7baf", 00:19:54.409 "is_configured": true, 00:19:54.409 "data_offset": 2048, 00:19:54.409 "data_size": 63488 00:19:54.409 }, 00:19:54.409 { 00:19:54.409 "name": "BaseBdev4", 00:19:54.409 "uuid": "37d35526-b0df-49d1-89b3-97499cbc7a6f", 00:19:54.409 "is_configured": true, 00:19:54.409 "data_offset": 2048, 00:19:54.409 "data_size": 63488 00:19:54.409 } 00:19:54.409 ] 00:19:54.409 }' 00:19:54.409 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:54.409 09:24:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:54.981 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:54.981 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:54.981 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:54.981 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:54.981 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:54.981 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:54.981 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:54.981 09:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:55.240 [2024-07-15 09:24:04.073144] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:55.240 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:55.240 "name": "Existed_Raid", 00:19:55.240 "aliases": [ 00:19:55.240 "aaba37db-3ea2-40fc-87bd-af24a1415be6" 00:19:55.240 ], 00:19:55.240 "product_name": "Raid Volume", 00:19:55.240 "block_size": 512, 00:19:55.240 "num_blocks": 253952, 00:19:55.240 "uuid": "aaba37db-3ea2-40fc-87bd-af24a1415be6", 00:19:55.240 "assigned_rate_limits": { 00:19:55.240 "rw_ios_per_sec": 0, 00:19:55.240 "rw_mbytes_per_sec": 0, 00:19:55.240 "r_mbytes_per_sec": 0, 00:19:55.240 "w_mbytes_per_sec": 0 00:19:55.240 }, 00:19:55.240 "claimed": false, 00:19:55.240 "zoned": false, 00:19:55.240 "supported_io_types": { 00:19:55.240 "read": true, 00:19:55.240 "write": true, 00:19:55.240 "unmap": true, 00:19:55.240 "flush": true, 00:19:55.240 "reset": true, 00:19:55.240 "nvme_admin": false, 00:19:55.240 "nvme_io": false, 00:19:55.240 "nvme_io_md": false, 00:19:55.240 "write_zeroes": true, 00:19:55.240 "zcopy": false, 00:19:55.240 "get_zone_info": false, 00:19:55.240 "zone_management": false, 00:19:55.240 "zone_append": false, 00:19:55.240 "compare": false, 00:19:55.240 "compare_and_write": false, 00:19:55.240 "abort": false, 00:19:55.240 "seek_hole": false, 00:19:55.240 "seek_data": false, 00:19:55.240 "copy": false, 00:19:55.240 "nvme_iov_md": false 00:19:55.240 }, 00:19:55.240 "memory_domains": [ 
00:19:55.240 { 00:19:55.240 "dma_device_id": "system", 00:19:55.240 "dma_device_type": 1 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.240 "dma_device_type": 2 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "dma_device_id": "system", 00:19:55.240 "dma_device_type": 1 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.240 "dma_device_type": 2 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "dma_device_id": "system", 00:19:55.240 "dma_device_type": 1 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.240 "dma_device_type": 2 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "dma_device_id": "system", 00:19:55.240 "dma_device_type": 1 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.240 "dma_device_type": 2 00:19:55.240 } 00:19:55.240 ], 00:19:55.240 "driver_specific": { 00:19:55.240 "raid": { 00:19:55.240 "uuid": "aaba37db-3ea2-40fc-87bd-af24a1415be6", 00:19:55.240 "strip_size_kb": 64, 00:19:55.240 "state": "online", 00:19:55.240 "raid_level": "concat", 00:19:55.240 "superblock": true, 00:19:55.240 "num_base_bdevs": 4, 00:19:55.240 "num_base_bdevs_discovered": 4, 00:19:55.240 "num_base_bdevs_operational": 4, 00:19:55.240 "base_bdevs_list": [ 00:19:55.240 { 00:19:55.240 "name": "BaseBdev1", 00:19:55.240 "uuid": "36d4b5dd-1c23-42ca-904f-294a639d63d9", 00:19:55.240 "is_configured": true, 00:19:55.240 "data_offset": 2048, 00:19:55.240 "data_size": 63488 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "name": "BaseBdev2", 00:19:55.240 "uuid": "2ffcd6ae-5ecb-4263-9de0-73dcbeb5c400", 00:19:55.240 "is_configured": true, 00:19:55.240 "data_offset": 2048, 00:19:55.240 "data_size": 63488 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "name": "BaseBdev3", 00:19:55.240 "uuid": "84403b91-699c-4183-801a-0165933d7baf", 00:19:55.240 "is_configured": true, 00:19:55.240 "data_offset": 2048, 00:19:55.240 "data_size": 63488 00:19:55.240 }, 00:19:55.240 { 00:19:55.240 "name": "BaseBdev4", 00:19:55.240 "uuid": "37d35526-b0df-49d1-89b3-97499cbc7a6f", 00:19:55.240 "is_configured": true, 00:19:55.240 "data_offset": 2048, 00:19:55.240 "data_size": 63488 00:19:55.240 } 00:19:55.240 ] 00:19:55.240 } 00:19:55.240 } 00:19:55.240 }' 00:19:55.240 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:55.240 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:55.240 BaseBdev2 00:19:55.241 BaseBdev3 00:19:55.241 BaseBdev4' 00:19:55.241 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.241 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:55.241 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.499 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.499 "name": "BaseBdev1", 00:19:55.499 "aliases": [ 00:19:55.499 "36d4b5dd-1c23-42ca-904f-294a639d63d9" 00:19:55.499 ], 00:19:55.499 "product_name": "Malloc disk", 00:19:55.499 "block_size": 512, 00:19:55.499 "num_blocks": 65536, 00:19:55.499 "uuid": "36d4b5dd-1c23-42ca-904f-294a639d63d9", 00:19:55.499 "assigned_rate_limits": { 00:19:55.499 
"rw_ios_per_sec": 0, 00:19:55.499 "rw_mbytes_per_sec": 0, 00:19:55.499 "r_mbytes_per_sec": 0, 00:19:55.499 "w_mbytes_per_sec": 0 00:19:55.499 }, 00:19:55.499 "claimed": true, 00:19:55.499 "claim_type": "exclusive_write", 00:19:55.499 "zoned": false, 00:19:55.499 "supported_io_types": { 00:19:55.499 "read": true, 00:19:55.499 "write": true, 00:19:55.499 "unmap": true, 00:19:55.499 "flush": true, 00:19:55.499 "reset": true, 00:19:55.499 "nvme_admin": false, 00:19:55.499 "nvme_io": false, 00:19:55.499 "nvme_io_md": false, 00:19:55.499 "write_zeroes": true, 00:19:55.499 "zcopy": true, 00:19:55.499 "get_zone_info": false, 00:19:55.499 "zone_management": false, 00:19:55.499 "zone_append": false, 00:19:55.499 "compare": false, 00:19:55.499 "compare_and_write": false, 00:19:55.499 "abort": true, 00:19:55.499 "seek_hole": false, 00:19:55.499 "seek_data": false, 00:19:55.499 "copy": true, 00:19:55.499 "nvme_iov_md": false 00:19:55.499 }, 00:19:55.499 "memory_domains": [ 00:19:55.499 { 00:19:55.499 "dma_device_id": "system", 00:19:55.499 "dma_device_type": 1 00:19:55.499 }, 00:19:55.499 { 00:19:55.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.499 "dma_device_type": 2 00:19:55.499 } 00:19:55.499 ], 00:19:55.499 "driver_specific": {} 00:19:55.499 }' 00:19:55.499 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.499 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.499 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:55.500 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:55.758 09:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:56.326 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:56.326 "name": "BaseBdev2", 00:19:56.326 "aliases": [ 00:19:56.326 "2ffcd6ae-5ecb-4263-9de0-73dcbeb5c400" 00:19:56.326 ], 00:19:56.326 "product_name": "Malloc disk", 00:19:56.326 "block_size": 512, 00:19:56.326 "num_blocks": 65536, 00:19:56.326 "uuid": "2ffcd6ae-5ecb-4263-9de0-73dcbeb5c400", 00:19:56.326 "assigned_rate_limits": { 00:19:56.326 "rw_ios_per_sec": 0, 00:19:56.326 "rw_mbytes_per_sec": 0, 00:19:56.326 "r_mbytes_per_sec": 0, 00:19:56.326 "w_mbytes_per_sec": 0 
00:19:56.326 }, 00:19:56.326 "claimed": true, 00:19:56.326 "claim_type": "exclusive_write", 00:19:56.326 "zoned": false, 00:19:56.326 "supported_io_types": { 00:19:56.326 "read": true, 00:19:56.326 "write": true, 00:19:56.326 "unmap": true, 00:19:56.326 "flush": true, 00:19:56.326 "reset": true, 00:19:56.326 "nvme_admin": false, 00:19:56.326 "nvme_io": false, 00:19:56.326 "nvme_io_md": false, 00:19:56.326 "write_zeroes": true, 00:19:56.326 "zcopy": true, 00:19:56.326 "get_zone_info": false, 00:19:56.326 "zone_management": false, 00:19:56.326 "zone_append": false, 00:19:56.326 "compare": false, 00:19:56.326 "compare_and_write": false, 00:19:56.326 "abort": true, 00:19:56.326 "seek_hole": false, 00:19:56.326 "seek_data": false, 00:19:56.326 "copy": true, 00:19:56.326 "nvme_iov_md": false 00:19:56.326 }, 00:19:56.326 "memory_domains": [ 00:19:56.326 { 00:19:56.326 "dma_device_id": "system", 00:19:56.326 "dma_device_type": 1 00:19:56.326 }, 00:19:56.326 { 00:19:56.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.326 "dma_device_type": 2 00:19:56.326 } 00:19:56.326 ], 00:19:56.326 "driver_specific": {} 00:19:56.326 }' 00:19:56.326 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.326 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.584 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:56.585 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:56.844 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:56.844 "name": "BaseBdev3", 00:19:56.844 "aliases": [ 00:19:56.844 "84403b91-699c-4183-801a-0165933d7baf" 00:19:56.844 ], 00:19:56.844 "product_name": "Malloc disk", 00:19:56.844 "block_size": 512, 00:19:56.844 "num_blocks": 65536, 00:19:56.844 "uuid": "84403b91-699c-4183-801a-0165933d7baf", 00:19:56.844 "assigned_rate_limits": { 00:19:56.844 "rw_ios_per_sec": 0, 00:19:56.844 "rw_mbytes_per_sec": 0, 00:19:56.844 "r_mbytes_per_sec": 0, 00:19:56.844 "w_mbytes_per_sec": 0 00:19:56.844 }, 00:19:56.844 "claimed": true, 00:19:56.844 "claim_type": "exclusive_write", 00:19:56.844 "zoned": false, 00:19:56.844 
"supported_io_types": { 00:19:56.844 "read": true, 00:19:56.844 "write": true, 00:19:56.844 "unmap": true, 00:19:56.844 "flush": true, 00:19:56.844 "reset": true, 00:19:56.844 "nvme_admin": false, 00:19:56.844 "nvme_io": false, 00:19:56.844 "nvme_io_md": false, 00:19:56.844 "write_zeroes": true, 00:19:56.844 "zcopy": true, 00:19:56.844 "get_zone_info": false, 00:19:56.844 "zone_management": false, 00:19:56.844 "zone_append": false, 00:19:56.844 "compare": false, 00:19:56.844 "compare_and_write": false, 00:19:56.844 "abort": true, 00:19:56.844 "seek_hole": false, 00:19:56.844 "seek_data": false, 00:19:56.844 "copy": true, 00:19:56.844 "nvme_iov_md": false 00:19:56.844 }, 00:19:56.844 "memory_domains": [ 00:19:56.844 { 00:19:56.844 "dma_device_id": "system", 00:19:56.844 "dma_device_type": 1 00:19:56.844 }, 00:19:56.844 { 00:19:56.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.844 "dma_device_type": 2 00:19:56.844 } 00:19:56.844 ], 00:19:56.844 "driver_specific": {} 00:19:56.844 }' 00:19:56.844 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.844 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.844 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:56.844 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.103 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.103 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:57.103 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.103 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.103 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:57.103 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.103 09:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.103 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:57.103 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:57.103 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:57.103 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:57.361 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:57.361 "name": "BaseBdev4", 00:19:57.361 "aliases": [ 00:19:57.361 "37d35526-b0df-49d1-89b3-97499cbc7a6f" 00:19:57.361 ], 00:19:57.361 "product_name": "Malloc disk", 00:19:57.361 "block_size": 512, 00:19:57.361 "num_blocks": 65536, 00:19:57.361 "uuid": "37d35526-b0df-49d1-89b3-97499cbc7a6f", 00:19:57.361 "assigned_rate_limits": { 00:19:57.361 "rw_ios_per_sec": 0, 00:19:57.361 "rw_mbytes_per_sec": 0, 00:19:57.361 "r_mbytes_per_sec": 0, 00:19:57.361 "w_mbytes_per_sec": 0 00:19:57.361 }, 00:19:57.361 "claimed": true, 00:19:57.361 "claim_type": "exclusive_write", 00:19:57.361 "zoned": false, 00:19:57.361 "supported_io_types": { 00:19:57.361 "read": true, 00:19:57.361 "write": true, 00:19:57.361 "unmap": true, 00:19:57.361 "flush": 
true, 00:19:57.361 "reset": true, 00:19:57.361 "nvme_admin": false, 00:19:57.362 "nvme_io": false, 00:19:57.362 "nvme_io_md": false, 00:19:57.362 "write_zeroes": true, 00:19:57.362 "zcopy": true, 00:19:57.362 "get_zone_info": false, 00:19:57.362 "zone_management": false, 00:19:57.362 "zone_append": false, 00:19:57.362 "compare": false, 00:19:57.362 "compare_and_write": false, 00:19:57.362 "abort": true, 00:19:57.362 "seek_hole": false, 00:19:57.362 "seek_data": false, 00:19:57.362 "copy": true, 00:19:57.362 "nvme_iov_md": false 00:19:57.362 }, 00:19:57.362 "memory_domains": [ 00:19:57.362 { 00:19:57.362 "dma_device_id": "system", 00:19:57.362 "dma_device_type": 1 00:19:57.362 }, 00:19:57.362 { 00:19:57.362 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.362 "dma_device_type": 2 00:19:57.362 } 00:19:57.362 ], 00:19:57.362 "driver_specific": {} 00:19:57.362 }' 00:19:57.362 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.620 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.620 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:57.620 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.620 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.620 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:57.620 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.620 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.620 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:57.620 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.878 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.878 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:57.878 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:58.136 [2024-07-15 09:24:06.912375] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:58.136 [2024-07-15 09:24:06.912403] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:58.136 [2024-07-15 09:24:06.912449] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:58.136 09:24:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.136 09:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:58.395 09:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.395 "name": "Existed_Raid", 00:19:58.395 "uuid": "aaba37db-3ea2-40fc-87bd-af24a1415be6", 00:19:58.395 "strip_size_kb": 64, 00:19:58.395 "state": "offline", 00:19:58.395 "raid_level": "concat", 00:19:58.395 "superblock": true, 00:19:58.395 "num_base_bdevs": 4, 00:19:58.395 "num_base_bdevs_discovered": 3, 00:19:58.395 "num_base_bdevs_operational": 3, 00:19:58.395 "base_bdevs_list": [ 00:19:58.395 { 00:19:58.395 "name": null, 00:19:58.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.395 "is_configured": false, 00:19:58.395 "data_offset": 2048, 00:19:58.395 "data_size": 63488 00:19:58.395 }, 00:19:58.395 { 00:19:58.395 "name": "BaseBdev2", 00:19:58.395 "uuid": "2ffcd6ae-5ecb-4263-9de0-73dcbeb5c400", 00:19:58.395 "is_configured": true, 00:19:58.395 "data_offset": 2048, 00:19:58.395 "data_size": 63488 00:19:58.395 }, 00:19:58.395 { 00:19:58.395 "name": "BaseBdev3", 00:19:58.395 "uuid": "84403b91-699c-4183-801a-0165933d7baf", 00:19:58.395 "is_configured": true, 00:19:58.395 "data_offset": 2048, 00:19:58.395 "data_size": 63488 00:19:58.395 }, 00:19:58.395 { 00:19:58.395 "name": "BaseBdev4", 00:19:58.395 "uuid": "37d35526-b0df-49d1-89b3-97499cbc7a6f", 00:19:58.395 "is_configured": true, 00:19:58.395 "data_offset": 2048, 00:19:58.395 "data_size": 63488 00:19:58.395 } 00:19:58.395 ] 00:19:58.395 }' 00:19:58.395 09:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.395 09:24:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:58.961 09:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:58.961 09:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:58.961 09:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.961 09:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:59.220 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:59.220 09:24:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:59.220 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:59.478 [2024-07-15 09:24:08.256960] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:59.478 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:59.478 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:59.478 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.478 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:59.735 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:59.735 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:59.735 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:59.993 [2024-07-15 09:24:08.762728] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:59.993 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:59.993 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:59.993 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.993 09:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:00.251 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:00.251 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:00.251 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:00.509 [2024-07-15 09:24:09.264181] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:00.509 [2024-07-15 09:24:09.264225] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc8350 name Existed_Raid, state offline 00:20:00.509 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:00.509 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:00.509 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.509 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:00.767 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:00.767 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:00.767 09:24:09 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:00.767 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:00.767 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:00.767 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:01.025 BaseBdev2 00:20:01.025 09:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:01.025 09:24:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:01.025 09:24:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:01.025 09:24:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:01.025 09:24:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:01.025 09:24:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:01.025 09:24:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:01.283 09:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:01.542 [ 00:20:01.542 { 00:20:01.542 "name": "BaseBdev2", 00:20:01.542 "aliases": [ 00:20:01.542 "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134" 00:20:01.542 ], 00:20:01.542 "product_name": "Malloc disk", 00:20:01.542 "block_size": 512, 00:20:01.542 "num_blocks": 65536, 00:20:01.542 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:01.542 "assigned_rate_limits": { 00:20:01.542 "rw_ios_per_sec": 0, 00:20:01.542 "rw_mbytes_per_sec": 0, 00:20:01.542 "r_mbytes_per_sec": 0, 00:20:01.542 "w_mbytes_per_sec": 0 00:20:01.542 }, 00:20:01.542 "claimed": false, 00:20:01.542 "zoned": false, 00:20:01.542 "supported_io_types": { 00:20:01.542 "read": true, 00:20:01.542 "write": true, 00:20:01.542 "unmap": true, 00:20:01.542 "flush": true, 00:20:01.542 "reset": true, 00:20:01.542 "nvme_admin": false, 00:20:01.542 "nvme_io": false, 00:20:01.542 "nvme_io_md": false, 00:20:01.542 "write_zeroes": true, 00:20:01.542 "zcopy": true, 00:20:01.542 "get_zone_info": false, 00:20:01.542 "zone_management": false, 00:20:01.542 "zone_append": false, 00:20:01.542 "compare": false, 00:20:01.542 "compare_and_write": false, 00:20:01.542 "abort": true, 00:20:01.542 "seek_hole": false, 00:20:01.542 "seek_data": false, 00:20:01.542 "copy": true, 00:20:01.542 "nvme_iov_md": false 00:20:01.542 }, 00:20:01.542 "memory_domains": [ 00:20:01.542 { 00:20:01.542 "dma_device_id": "system", 00:20:01.542 "dma_device_type": 1 00:20:01.542 }, 00:20:01.542 { 00:20:01.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.542 "dma_device_type": 2 00:20:01.542 } 00:20:01.542 ], 00:20:01.542 "driver_specific": {} 00:20:01.542 } 00:20:01.542 ] 00:20:01.542 09:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:01.542 09:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:01.542 09:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < 
num_base_bdevs )) 00:20:01.542 09:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:01.800 BaseBdev3 00:20:01.800 09:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:01.800 09:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:01.800 09:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:01.800 09:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:01.800 09:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:01.800 09:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:01.800 09:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:02.058 09:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:02.315 [ 00:20:02.315 { 00:20:02.315 "name": "BaseBdev3", 00:20:02.315 "aliases": [ 00:20:02.315 "cab3ee15-2c80-4ba8-bcbc-34cf335587e1" 00:20:02.315 ], 00:20:02.315 "product_name": "Malloc disk", 00:20:02.315 "block_size": 512, 00:20:02.315 "num_blocks": 65536, 00:20:02.315 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:02.315 "assigned_rate_limits": { 00:20:02.315 "rw_ios_per_sec": 0, 00:20:02.315 "rw_mbytes_per_sec": 0, 00:20:02.315 "r_mbytes_per_sec": 0, 00:20:02.315 "w_mbytes_per_sec": 0 00:20:02.315 }, 00:20:02.315 "claimed": false, 00:20:02.315 "zoned": false, 00:20:02.315 "supported_io_types": { 00:20:02.315 "read": true, 00:20:02.315 "write": true, 00:20:02.315 "unmap": true, 00:20:02.315 "flush": true, 00:20:02.315 "reset": true, 00:20:02.315 "nvme_admin": false, 00:20:02.315 "nvme_io": false, 00:20:02.315 "nvme_io_md": false, 00:20:02.315 "write_zeroes": true, 00:20:02.315 "zcopy": true, 00:20:02.315 "get_zone_info": false, 00:20:02.315 "zone_management": false, 00:20:02.315 "zone_append": false, 00:20:02.315 "compare": false, 00:20:02.315 "compare_and_write": false, 00:20:02.315 "abort": true, 00:20:02.315 "seek_hole": false, 00:20:02.315 "seek_data": false, 00:20:02.315 "copy": true, 00:20:02.315 "nvme_iov_md": false 00:20:02.315 }, 00:20:02.315 "memory_domains": [ 00:20:02.315 { 00:20:02.315 "dma_device_id": "system", 00:20:02.315 "dma_device_type": 1 00:20:02.315 }, 00:20:02.315 { 00:20:02.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:02.316 "dma_device_type": 2 00:20:02.316 } 00:20:02.316 ], 00:20:02.316 "driver_specific": {} 00:20:02.316 } 00:20:02.316 ] 00:20:02.316 09:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:02.316 09:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:02.316 09:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:02.316 09:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:02.574 
BaseBdev4 00:20:02.574 09:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:02.574 09:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:02.574 09:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:02.574 09:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:02.574 09:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:02.574 09:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:02.574 09:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:02.833 09:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:02.833 [ 00:20:02.833 { 00:20:02.833 "name": "BaseBdev4", 00:20:02.833 "aliases": [ 00:20:02.833 "e885a51a-0101-41fa-8521-a24c0cad4d7e" 00:20:02.833 ], 00:20:02.833 "product_name": "Malloc disk", 00:20:02.833 "block_size": 512, 00:20:02.833 "num_blocks": 65536, 00:20:02.833 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:02.833 "assigned_rate_limits": { 00:20:02.833 "rw_ios_per_sec": 0, 00:20:02.833 "rw_mbytes_per_sec": 0, 00:20:02.833 "r_mbytes_per_sec": 0, 00:20:02.833 "w_mbytes_per_sec": 0 00:20:02.833 }, 00:20:02.833 "claimed": false, 00:20:02.833 "zoned": false, 00:20:02.833 "supported_io_types": { 00:20:02.833 "read": true, 00:20:02.833 "write": true, 00:20:02.833 "unmap": true, 00:20:02.833 "flush": true, 00:20:02.833 "reset": true, 00:20:02.833 "nvme_admin": false, 00:20:02.833 "nvme_io": false, 00:20:02.833 "nvme_io_md": false, 00:20:02.833 "write_zeroes": true, 00:20:02.833 "zcopy": true, 00:20:02.833 "get_zone_info": false, 00:20:02.833 "zone_management": false, 00:20:02.833 "zone_append": false, 00:20:02.833 "compare": false, 00:20:02.833 "compare_and_write": false, 00:20:02.833 "abort": true, 00:20:02.833 "seek_hole": false, 00:20:02.833 "seek_data": false, 00:20:02.833 "copy": true, 00:20:02.833 "nvme_iov_md": false 00:20:02.833 }, 00:20:02.833 "memory_domains": [ 00:20:02.834 { 00:20:02.834 "dma_device_id": "system", 00:20:02.834 "dma_device_type": 1 00:20:02.834 }, 00:20:02.834 { 00:20:02.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:02.834 "dma_device_type": 2 00:20:02.834 } 00:20:02.834 ], 00:20:02.834 "driver_specific": {} 00:20:02.834 } 00:20:02.834 ] 00:20:02.834 09:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:02.834 09:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:02.834 09:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:02.834 09:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:03.092 [2024-07-15 09:24:11.997047] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:03.093 [2024-07-15 09:24:11.997089] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:03.093 [2024-07-15 09:24:11.997109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:03.093 [2024-07-15 09:24:11.998475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:03.093 [2024-07-15 09:24:11.998517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.093 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:03.351 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.351 "name": "Existed_Raid", 00:20:03.351 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:03.351 "strip_size_kb": 64, 00:20:03.351 "state": "configuring", 00:20:03.351 "raid_level": "concat", 00:20:03.351 "superblock": true, 00:20:03.351 "num_base_bdevs": 4, 00:20:03.351 "num_base_bdevs_discovered": 3, 00:20:03.351 "num_base_bdevs_operational": 4, 00:20:03.351 "base_bdevs_list": [ 00:20:03.351 { 00:20:03.351 "name": "BaseBdev1", 00:20:03.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.351 "is_configured": false, 00:20:03.351 "data_offset": 0, 00:20:03.351 "data_size": 0 00:20:03.351 }, 00:20:03.351 { 00:20:03.351 "name": "BaseBdev2", 00:20:03.351 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:03.351 "is_configured": true, 00:20:03.351 "data_offset": 2048, 00:20:03.351 "data_size": 63488 00:20:03.351 }, 00:20:03.351 { 00:20:03.351 "name": "BaseBdev3", 00:20:03.351 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:03.351 "is_configured": true, 00:20:03.351 "data_offset": 2048, 00:20:03.351 "data_size": 63488 00:20:03.351 }, 00:20:03.351 { 00:20:03.351 "name": "BaseBdev4", 00:20:03.351 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:03.351 "is_configured": true, 00:20:03.351 "data_offset": 2048, 00:20:03.351 "data_size": 63488 00:20:03.351 } 00:20:03.351 ] 00:20:03.351 }' 00:20:03.351 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.351 09:24:12 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:03.919 09:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:04.178 [2024-07-15 09:24:13.055866] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.178 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:04.437 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.437 "name": "Existed_Raid", 00:20:04.437 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:04.437 "strip_size_kb": 64, 00:20:04.437 "state": "configuring", 00:20:04.437 "raid_level": "concat", 00:20:04.437 "superblock": true, 00:20:04.437 "num_base_bdevs": 4, 00:20:04.437 "num_base_bdevs_discovered": 2, 00:20:04.437 "num_base_bdevs_operational": 4, 00:20:04.437 "base_bdevs_list": [ 00:20:04.437 { 00:20:04.437 "name": "BaseBdev1", 00:20:04.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.437 "is_configured": false, 00:20:04.437 "data_offset": 0, 00:20:04.437 "data_size": 0 00:20:04.437 }, 00:20:04.437 { 00:20:04.437 "name": null, 00:20:04.437 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:04.437 "is_configured": false, 00:20:04.437 "data_offset": 2048, 00:20:04.437 "data_size": 63488 00:20:04.437 }, 00:20:04.437 { 00:20:04.437 "name": "BaseBdev3", 00:20:04.437 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:04.437 "is_configured": true, 00:20:04.437 "data_offset": 2048, 00:20:04.437 "data_size": 63488 00:20:04.437 }, 00:20:04.437 { 00:20:04.437 "name": "BaseBdev4", 00:20:04.437 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:04.437 "is_configured": true, 00:20:04.437 "data_offset": 2048, 00:20:04.437 "data_size": 63488 00:20:04.437 } 00:20:04.437 ] 00:20:04.437 }' 00:20:04.437 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.437 09:24:13 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:20:05.009 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.009 09:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:05.268 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:05.268 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:05.527 [2024-07-15 09:24:14.363966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:05.527 BaseBdev1 00:20:05.527 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:05.527 09:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:05.527 09:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:05.527 09:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:05.527 09:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:05.527 09:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:05.527 09:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:05.786 09:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:06.045 [ 00:20:06.045 { 00:20:06.045 "name": "BaseBdev1", 00:20:06.045 "aliases": [ 00:20:06.045 "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c" 00:20:06.045 ], 00:20:06.045 "product_name": "Malloc disk", 00:20:06.045 "block_size": 512, 00:20:06.045 "num_blocks": 65536, 00:20:06.045 "uuid": "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:06.045 "assigned_rate_limits": { 00:20:06.045 "rw_ios_per_sec": 0, 00:20:06.045 "rw_mbytes_per_sec": 0, 00:20:06.045 "r_mbytes_per_sec": 0, 00:20:06.045 "w_mbytes_per_sec": 0 00:20:06.045 }, 00:20:06.045 "claimed": true, 00:20:06.045 "claim_type": "exclusive_write", 00:20:06.045 "zoned": false, 00:20:06.045 "supported_io_types": { 00:20:06.045 "read": true, 00:20:06.045 "write": true, 00:20:06.045 "unmap": true, 00:20:06.045 "flush": true, 00:20:06.045 "reset": true, 00:20:06.045 "nvme_admin": false, 00:20:06.045 "nvme_io": false, 00:20:06.045 "nvme_io_md": false, 00:20:06.045 "write_zeroes": true, 00:20:06.045 "zcopy": true, 00:20:06.045 "get_zone_info": false, 00:20:06.045 "zone_management": false, 00:20:06.045 "zone_append": false, 00:20:06.045 "compare": false, 00:20:06.045 "compare_and_write": false, 00:20:06.045 "abort": true, 00:20:06.045 "seek_hole": false, 00:20:06.045 "seek_data": false, 00:20:06.045 "copy": true, 00:20:06.045 "nvme_iov_md": false 00:20:06.045 }, 00:20:06.045 "memory_domains": [ 00:20:06.045 { 00:20:06.045 "dma_device_id": "system", 00:20:06.045 "dma_device_type": 1 00:20:06.045 }, 00:20:06.045 { 00:20:06.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:06.045 "dma_device_type": 2 
00:20:06.045 } 00:20:06.045 ], 00:20:06.045 "driver_specific": {} 00:20:06.045 } 00:20:06.045 ] 00:20:06.045 09:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:06.045 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.046 09:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:06.305 09:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.305 "name": "Existed_Raid", 00:20:06.305 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:06.305 "strip_size_kb": 64, 00:20:06.305 "state": "configuring", 00:20:06.305 "raid_level": "concat", 00:20:06.305 "superblock": true, 00:20:06.305 "num_base_bdevs": 4, 00:20:06.305 "num_base_bdevs_discovered": 3, 00:20:06.305 "num_base_bdevs_operational": 4, 00:20:06.305 "base_bdevs_list": [ 00:20:06.305 { 00:20:06.305 "name": "BaseBdev1", 00:20:06.305 "uuid": "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:06.305 "is_configured": true, 00:20:06.305 "data_offset": 2048, 00:20:06.305 "data_size": 63488 00:20:06.305 }, 00:20:06.305 { 00:20:06.305 "name": null, 00:20:06.305 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:06.305 "is_configured": false, 00:20:06.305 "data_offset": 2048, 00:20:06.305 "data_size": 63488 00:20:06.305 }, 00:20:06.305 { 00:20:06.305 "name": "BaseBdev3", 00:20:06.305 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:06.305 "is_configured": true, 00:20:06.305 "data_offset": 2048, 00:20:06.305 "data_size": 63488 00:20:06.305 }, 00:20:06.305 { 00:20:06.305 "name": "BaseBdev4", 00:20:06.305 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:06.305 "is_configured": true, 00:20:06.305 "data_offset": 2048, 00:20:06.305 "data_size": 63488 00:20:06.305 } 00:20:06.305 ] 00:20:06.305 }' 00:20:06.305 09:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.305 09:24:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:06.872 09:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:06.872 09:24:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.131 09:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:07.131 09:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:07.390 [2024-07-15 09:24:16.136705] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.390 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.649 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.649 "name": "Existed_Raid", 00:20:07.649 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:07.649 "strip_size_kb": 64, 00:20:07.649 "state": "configuring", 00:20:07.649 "raid_level": "concat", 00:20:07.649 "superblock": true, 00:20:07.649 "num_base_bdevs": 4, 00:20:07.649 "num_base_bdevs_discovered": 2, 00:20:07.649 "num_base_bdevs_operational": 4, 00:20:07.649 "base_bdevs_list": [ 00:20:07.649 { 00:20:07.649 "name": "BaseBdev1", 00:20:07.649 "uuid": "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:07.649 "is_configured": true, 00:20:07.649 "data_offset": 2048, 00:20:07.649 "data_size": 63488 00:20:07.649 }, 00:20:07.649 { 00:20:07.649 "name": null, 00:20:07.649 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:07.649 "is_configured": false, 00:20:07.649 "data_offset": 2048, 00:20:07.649 "data_size": 63488 00:20:07.649 }, 00:20:07.649 { 00:20:07.649 "name": null, 00:20:07.649 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:07.649 "is_configured": false, 00:20:07.649 "data_offset": 2048, 00:20:07.649 "data_size": 63488 00:20:07.649 }, 00:20:07.649 { 00:20:07.649 "name": "BaseBdev4", 00:20:07.649 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:07.649 "is_configured": true, 00:20:07.649 "data_offset": 2048, 00:20:07.649 "data_size": 63488 00:20:07.649 } 
00:20:07.649 ] 00:20:07.649 }' 00:20:07.649 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.649 09:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:08.216 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.216 09:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:08.475 [2024-07-15 09:24:17.400083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.475 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:08.738 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.738 "name": "Existed_Raid", 00:20:08.738 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:08.738 "strip_size_kb": 64, 00:20:08.738 "state": "configuring", 00:20:08.738 "raid_level": "concat", 00:20:08.738 "superblock": true, 00:20:08.738 "num_base_bdevs": 4, 00:20:08.738 "num_base_bdevs_discovered": 3, 00:20:08.738 "num_base_bdevs_operational": 4, 00:20:08.738 "base_bdevs_list": [ 00:20:08.738 { 00:20:08.738 "name": "BaseBdev1", 00:20:08.738 "uuid": "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:08.738 "is_configured": true, 00:20:08.738 "data_offset": 2048, 00:20:08.738 "data_size": 63488 00:20:08.738 }, 00:20:08.738 { 00:20:08.738 "name": null, 00:20:08.738 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:08.738 "is_configured": false, 00:20:08.738 "data_offset": 2048, 00:20:08.738 "data_size": 63488 00:20:08.738 }, 00:20:08.738 { 
00:20:08.738 "name": "BaseBdev3", 00:20:08.738 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:08.738 "is_configured": true, 00:20:08.738 "data_offset": 2048, 00:20:08.738 "data_size": 63488 00:20:08.738 }, 00:20:08.738 { 00:20:08.738 "name": "BaseBdev4", 00:20:08.738 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:08.738 "is_configured": true, 00:20:08.738 "data_offset": 2048, 00:20:08.738 "data_size": 63488 00:20:08.738 } 00:20:08.738 ] 00:20:08.738 }' 00:20:08.738 09:24:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.738 09:24:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:09.673 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.673 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:09.673 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:09.673 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:09.933 [2024-07-15 09:24:18.667462] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:09.933 "name": "Existed_Raid", 00:20:09.933 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:09.933 "strip_size_kb": 64, 00:20:09.933 "state": "configuring", 00:20:09.933 "raid_level": "concat", 00:20:09.933 "superblock": true, 00:20:09.933 "num_base_bdevs": 4, 00:20:09.933 "num_base_bdevs_discovered": 2, 00:20:09.933 "num_base_bdevs_operational": 4, 00:20:09.933 "base_bdevs_list": [ 00:20:09.933 { 00:20:09.933 "name": null, 00:20:09.933 "uuid": 
"d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:09.933 "is_configured": false, 00:20:09.933 "data_offset": 2048, 00:20:09.933 "data_size": 63488 00:20:09.933 }, 00:20:09.933 { 00:20:09.933 "name": null, 00:20:09.933 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:09.933 "is_configured": false, 00:20:09.933 "data_offset": 2048, 00:20:09.933 "data_size": 63488 00:20:09.933 }, 00:20:09.933 { 00:20:09.933 "name": "BaseBdev3", 00:20:09.933 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:09.933 "is_configured": true, 00:20:09.933 "data_offset": 2048, 00:20:09.933 "data_size": 63488 00:20:09.933 }, 00:20:09.933 { 00:20:09.933 "name": "BaseBdev4", 00:20:09.933 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:09.933 "is_configured": true, 00:20:09.933 "data_offset": 2048, 00:20:09.933 "data_size": 63488 00:20:09.933 } 00:20:09.933 ] 00:20:09.933 }' 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:09.933 09:24:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:10.501 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.501 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:10.760 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:10.760 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:11.020 [2024-07-15 09:24:19.835106] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.020 09:24:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:11.279 09:24:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:11.279 "name": 
"Existed_Raid", 00:20:11.279 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:11.279 "strip_size_kb": 64, 00:20:11.279 "state": "configuring", 00:20:11.279 "raid_level": "concat", 00:20:11.279 "superblock": true, 00:20:11.279 "num_base_bdevs": 4, 00:20:11.279 "num_base_bdevs_discovered": 3, 00:20:11.279 "num_base_bdevs_operational": 4, 00:20:11.279 "base_bdevs_list": [ 00:20:11.279 { 00:20:11.279 "name": null, 00:20:11.279 "uuid": "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:11.279 "is_configured": false, 00:20:11.279 "data_offset": 2048, 00:20:11.279 "data_size": 63488 00:20:11.279 }, 00:20:11.279 { 00:20:11.279 "name": "BaseBdev2", 00:20:11.279 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:11.279 "is_configured": true, 00:20:11.279 "data_offset": 2048, 00:20:11.279 "data_size": 63488 00:20:11.279 }, 00:20:11.279 { 00:20:11.279 "name": "BaseBdev3", 00:20:11.279 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:11.279 "is_configured": true, 00:20:11.279 "data_offset": 2048, 00:20:11.279 "data_size": 63488 00:20:11.279 }, 00:20:11.279 { 00:20:11.279 "name": "BaseBdev4", 00:20:11.279 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:11.279 "is_configured": true, 00:20:11.279 "data_offset": 2048, 00:20:11.279 "data_size": 63488 00:20:11.279 } 00:20:11.279 ] 00:20:11.279 }' 00:20:11.279 09:24:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:11.279 09:24:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:11.847 09:24:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.847 09:24:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:12.105 09:24:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:12.105 09:24:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.105 09:24:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:12.364 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c 00:20:12.623 [2024-07-15 09:24:21.358702] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:12.624 [2024-07-15 09:24:21.358860] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcca850 00:20:12.624 [2024-07-15 09:24:21.358873] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:12.624 [2024-07-15 09:24:21.359060] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcc0d80 00:20:12.624 [2024-07-15 09:24:21.359178] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcca850 00:20:12.624 [2024-07-15 09:24:21.359188] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcca850 00:20:12.624 [2024-07-15 09:24:21.359278] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:12.624 NewBaseBdev 00:20:12.624 09:24:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:12.624 09:24:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:12.624 09:24:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:12.624 09:24:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:12.624 09:24:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:12.624 09:24:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:12.624 09:24:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:12.882 09:24:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:13.141 [ 00:20:13.141 { 00:20:13.141 "name": "NewBaseBdev", 00:20:13.141 "aliases": [ 00:20:13.141 "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c" 00:20:13.141 ], 00:20:13.141 "product_name": "Malloc disk", 00:20:13.141 "block_size": 512, 00:20:13.141 "num_blocks": 65536, 00:20:13.141 "uuid": "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:13.141 "assigned_rate_limits": { 00:20:13.141 "rw_ios_per_sec": 0, 00:20:13.141 "rw_mbytes_per_sec": 0, 00:20:13.141 "r_mbytes_per_sec": 0, 00:20:13.141 "w_mbytes_per_sec": 0 00:20:13.141 }, 00:20:13.141 "claimed": true, 00:20:13.141 "claim_type": "exclusive_write", 00:20:13.141 "zoned": false, 00:20:13.141 "supported_io_types": { 00:20:13.142 "read": true, 00:20:13.142 "write": true, 00:20:13.142 "unmap": true, 00:20:13.142 "flush": true, 00:20:13.142 "reset": true, 00:20:13.142 "nvme_admin": false, 00:20:13.142 "nvme_io": false, 00:20:13.142 "nvme_io_md": false, 00:20:13.142 "write_zeroes": true, 00:20:13.142 "zcopy": true, 00:20:13.142 "get_zone_info": false, 00:20:13.142 "zone_management": false, 00:20:13.142 "zone_append": false, 00:20:13.142 "compare": false, 00:20:13.142 "compare_and_write": false, 00:20:13.142 "abort": true, 00:20:13.142 "seek_hole": false, 00:20:13.142 "seek_data": false, 00:20:13.142 "copy": true, 00:20:13.142 "nvme_iov_md": false 00:20:13.142 }, 00:20:13.142 "memory_domains": [ 00:20:13.142 { 00:20:13.142 "dma_device_id": "system", 00:20:13.142 "dma_device_type": 1 00:20:13.142 }, 00:20:13.142 { 00:20:13.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.142 "dma_device_type": 2 00:20:13.142 } 00:20:13.142 ], 00:20:13.142 "driver_specific": {} 00:20:13.142 } 00:20:13.142 ] 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.142 09:24:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:13.142 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:13.142 "name": "Existed_Raid", 00:20:13.142 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:13.142 "strip_size_kb": 64, 00:20:13.142 "state": "online", 00:20:13.142 "raid_level": "concat", 00:20:13.142 "superblock": true, 00:20:13.142 "num_base_bdevs": 4, 00:20:13.142 "num_base_bdevs_discovered": 4, 00:20:13.142 "num_base_bdevs_operational": 4, 00:20:13.142 "base_bdevs_list": [ 00:20:13.142 { 00:20:13.142 "name": "NewBaseBdev", 00:20:13.142 "uuid": "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:13.142 "is_configured": true, 00:20:13.142 "data_offset": 2048, 00:20:13.142 "data_size": 63488 00:20:13.142 }, 00:20:13.142 { 00:20:13.142 "name": "BaseBdev2", 00:20:13.142 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:13.142 "is_configured": true, 00:20:13.142 "data_offset": 2048, 00:20:13.142 "data_size": 63488 00:20:13.142 }, 00:20:13.142 { 00:20:13.142 "name": "BaseBdev3", 00:20:13.142 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:13.142 "is_configured": true, 00:20:13.142 "data_offset": 2048, 00:20:13.142 "data_size": 63488 00:20:13.142 }, 00:20:13.142 { 00:20:13.142 "name": "BaseBdev4", 00:20:13.142 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:13.142 "is_configured": true, 00:20:13.142 "data_offset": 2048, 00:20:13.142 "data_size": 63488 00:20:13.142 } 00:20:13.142 ] 00:20:13.142 }' 00:20:13.142 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:13.142 09:24:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:13.710 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:13.710 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:13.710 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:13.710 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:13.710 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:13.710 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:13.710 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:13.710 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:13.968 [2024-07-15 09:24:22.742678] bdev_raid.c:1107:raid_bdev_dump_info_json: 
*DEBUG*: raid_bdev_dump_config_json 00:20:13.968 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:13.968 "name": "Existed_Raid", 00:20:13.968 "aliases": [ 00:20:13.969 "6b744f24-014a-4be8-8700-855b8574c486" 00:20:13.969 ], 00:20:13.969 "product_name": "Raid Volume", 00:20:13.969 "block_size": 512, 00:20:13.969 "num_blocks": 253952, 00:20:13.969 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:13.969 "assigned_rate_limits": { 00:20:13.969 "rw_ios_per_sec": 0, 00:20:13.969 "rw_mbytes_per_sec": 0, 00:20:13.969 "r_mbytes_per_sec": 0, 00:20:13.969 "w_mbytes_per_sec": 0 00:20:13.969 }, 00:20:13.969 "claimed": false, 00:20:13.969 "zoned": false, 00:20:13.969 "supported_io_types": { 00:20:13.969 "read": true, 00:20:13.969 "write": true, 00:20:13.969 "unmap": true, 00:20:13.969 "flush": true, 00:20:13.969 "reset": true, 00:20:13.969 "nvme_admin": false, 00:20:13.969 "nvme_io": false, 00:20:13.969 "nvme_io_md": false, 00:20:13.969 "write_zeroes": true, 00:20:13.969 "zcopy": false, 00:20:13.969 "get_zone_info": false, 00:20:13.969 "zone_management": false, 00:20:13.969 "zone_append": false, 00:20:13.969 "compare": false, 00:20:13.969 "compare_and_write": false, 00:20:13.969 "abort": false, 00:20:13.969 "seek_hole": false, 00:20:13.969 "seek_data": false, 00:20:13.969 "copy": false, 00:20:13.969 "nvme_iov_md": false 00:20:13.969 }, 00:20:13.969 "memory_domains": [ 00:20:13.969 { 00:20:13.969 "dma_device_id": "system", 00:20:13.969 "dma_device_type": 1 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.969 "dma_device_type": 2 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "dma_device_id": "system", 00:20:13.969 "dma_device_type": 1 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.969 "dma_device_type": 2 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "dma_device_id": "system", 00:20:13.969 "dma_device_type": 1 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.969 "dma_device_type": 2 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "dma_device_id": "system", 00:20:13.969 "dma_device_type": 1 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.969 "dma_device_type": 2 00:20:13.969 } 00:20:13.969 ], 00:20:13.969 "driver_specific": { 00:20:13.969 "raid": { 00:20:13.969 "uuid": "6b744f24-014a-4be8-8700-855b8574c486", 00:20:13.969 "strip_size_kb": 64, 00:20:13.969 "state": "online", 00:20:13.969 "raid_level": "concat", 00:20:13.969 "superblock": true, 00:20:13.969 "num_base_bdevs": 4, 00:20:13.969 "num_base_bdevs_discovered": 4, 00:20:13.969 "num_base_bdevs_operational": 4, 00:20:13.969 "base_bdevs_list": [ 00:20:13.969 { 00:20:13.969 "name": "NewBaseBdev", 00:20:13.969 "uuid": "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:13.969 "is_configured": true, 00:20:13.969 "data_offset": 2048, 00:20:13.969 "data_size": 63488 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "name": "BaseBdev2", 00:20:13.969 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:13.969 "is_configured": true, 00:20:13.969 "data_offset": 2048, 00:20:13.969 "data_size": 63488 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "name": "BaseBdev3", 00:20:13.969 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:13.969 "is_configured": true, 00:20:13.969 "data_offset": 2048, 00:20:13.969 "data_size": 63488 00:20:13.969 }, 00:20:13.969 { 00:20:13.969 "name": "BaseBdev4", 00:20:13.969 "uuid": 
"e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:13.969 "is_configured": true, 00:20:13.969 "data_offset": 2048, 00:20:13.969 "data_size": 63488 00:20:13.969 } 00:20:13.969 ] 00:20:13.969 } 00:20:13.969 } 00:20:13.969 }' 00:20:13.969 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:13.969 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:13.969 BaseBdev2 00:20:13.969 BaseBdev3 00:20:13.969 BaseBdev4' 00:20:13.969 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:13.969 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:13.969 09:24:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:14.227 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:14.227 "name": "NewBaseBdev", 00:20:14.227 "aliases": [ 00:20:14.227 "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c" 00:20:14.227 ], 00:20:14.227 "product_name": "Malloc disk", 00:20:14.227 "block_size": 512, 00:20:14.227 "num_blocks": 65536, 00:20:14.227 "uuid": "d7e6c2d4-3ff0-41d8-ba02-a70f09f93a4c", 00:20:14.227 "assigned_rate_limits": { 00:20:14.227 "rw_ios_per_sec": 0, 00:20:14.227 "rw_mbytes_per_sec": 0, 00:20:14.227 "r_mbytes_per_sec": 0, 00:20:14.227 "w_mbytes_per_sec": 0 00:20:14.227 }, 00:20:14.227 "claimed": true, 00:20:14.227 "claim_type": "exclusive_write", 00:20:14.227 "zoned": false, 00:20:14.227 "supported_io_types": { 00:20:14.227 "read": true, 00:20:14.227 "write": true, 00:20:14.227 "unmap": true, 00:20:14.227 "flush": true, 00:20:14.227 "reset": true, 00:20:14.227 "nvme_admin": false, 00:20:14.227 "nvme_io": false, 00:20:14.227 "nvme_io_md": false, 00:20:14.227 "write_zeroes": true, 00:20:14.227 "zcopy": true, 00:20:14.227 "get_zone_info": false, 00:20:14.227 "zone_management": false, 00:20:14.227 "zone_append": false, 00:20:14.227 "compare": false, 00:20:14.227 "compare_and_write": false, 00:20:14.227 "abort": true, 00:20:14.227 "seek_hole": false, 00:20:14.227 "seek_data": false, 00:20:14.227 "copy": true, 00:20:14.227 "nvme_iov_md": false 00:20:14.227 }, 00:20:14.227 "memory_domains": [ 00:20:14.227 { 00:20:14.227 "dma_device_id": "system", 00:20:14.227 "dma_device_type": 1 00:20:14.227 }, 00:20:14.227 { 00:20:14.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.227 "dma_device_type": 2 00:20:14.227 } 00:20:14.227 ], 00:20:14.227 "driver_specific": {} 00:20:14.227 }' 00:20:14.227 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.227 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.227 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:14.227 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.227 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:14.485 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:14.744 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:14.744 "name": "BaseBdev2", 00:20:14.744 "aliases": [ 00:20:14.744 "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134" 00:20:14.744 ], 00:20:14.744 "product_name": "Malloc disk", 00:20:14.744 "block_size": 512, 00:20:14.744 "num_blocks": 65536, 00:20:14.744 "uuid": "c303ebe1-02fe-4c18-8e0d-a34a4d5ae134", 00:20:14.744 "assigned_rate_limits": { 00:20:14.744 "rw_ios_per_sec": 0, 00:20:14.744 "rw_mbytes_per_sec": 0, 00:20:14.744 "r_mbytes_per_sec": 0, 00:20:14.744 "w_mbytes_per_sec": 0 00:20:14.744 }, 00:20:14.744 "claimed": true, 00:20:14.744 "claim_type": "exclusive_write", 00:20:14.744 "zoned": false, 00:20:14.744 "supported_io_types": { 00:20:14.744 "read": true, 00:20:14.744 "write": true, 00:20:14.744 "unmap": true, 00:20:14.744 "flush": true, 00:20:14.744 "reset": true, 00:20:14.744 "nvme_admin": false, 00:20:14.744 "nvme_io": false, 00:20:14.744 "nvme_io_md": false, 00:20:14.744 "write_zeroes": true, 00:20:14.744 "zcopy": true, 00:20:14.744 "get_zone_info": false, 00:20:14.744 "zone_management": false, 00:20:14.744 "zone_append": false, 00:20:14.744 "compare": false, 00:20:14.744 "compare_and_write": false, 00:20:14.744 "abort": true, 00:20:14.744 "seek_hole": false, 00:20:14.744 "seek_data": false, 00:20:14.744 "copy": true, 00:20:14.744 "nvme_iov_md": false 00:20:14.744 }, 00:20:14.744 "memory_domains": [ 00:20:14.744 { 00:20:14.744 "dma_device_id": "system", 00:20:14.744 "dma_device_type": 1 00:20:14.744 }, 00:20:14.744 { 00:20:14.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.744 "dma_device_type": 2 00:20:14.744 } 00:20:14.744 ], 00:20:14.744 "driver_specific": {} 00:20:14.744 }' 00:20:14.744 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.744 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.002 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.002 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.002 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.002 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:15.002 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.002 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.002 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- 
# [[ null == null ]] 00:20:15.002 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.003 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.262 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:15.262 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:15.262 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:15.262 09:24:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:15.521 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:15.521 "name": "BaseBdev3", 00:20:15.521 "aliases": [ 00:20:15.521 "cab3ee15-2c80-4ba8-bcbc-34cf335587e1" 00:20:15.521 ], 00:20:15.521 "product_name": "Malloc disk", 00:20:15.521 "block_size": 512, 00:20:15.521 "num_blocks": 65536, 00:20:15.521 "uuid": "cab3ee15-2c80-4ba8-bcbc-34cf335587e1", 00:20:15.521 "assigned_rate_limits": { 00:20:15.521 "rw_ios_per_sec": 0, 00:20:15.521 "rw_mbytes_per_sec": 0, 00:20:15.521 "r_mbytes_per_sec": 0, 00:20:15.521 "w_mbytes_per_sec": 0 00:20:15.521 }, 00:20:15.521 "claimed": true, 00:20:15.521 "claim_type": "exclusive_write", 00:20:15.521 "zoned": false, 00:20:15.521 "supported_io_types": { 00:20:15.521 "read": true, 00:20:15.521 "write": true, 00:20:15.521 "unmap": true, 00:20:15.521 "flush": true, 00:20:15.521 "reset": true, 00:20:15.521 "nvme_admin": false, 00:20:15.521 "nvme_io": false, 00:20:15.521 "nvme_io_md": false, 00:20:15.521 "write_zeroes": true, 00:20:15.521 "zcopy": true, 00:20:15.521 "get_zone_info": false, 00:20:15.521 "zone_management": false, 00:20:15.521 "zone_append": false, 00:20:15.521 "compare": false, 00:20:15.521 "compare_and_write": false, 00:20:15.521 "abort": true, 00:20:15.521 "seek_hole": false, 00:20:15.521 "seek_data": false, 00:20:15.521 "copy": true, 00:20:15.521 "nvme_iov_md": false 00:20:15.521 }, 00:20:15.521 "memory_domains": [ 00:20:15.521 { 00:20:15.521 "dma_device_id": "system", 00:20:15.521 "dma_device_type": 1 00:20:15.521 }, 00:20:15.521 { 00:20:15.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.521 "dma_device_type": 2 00:20:15.521 } 00:20:15.521 ], 00:20:15.521 "driver_specific": {} 00:20:15.521 }' 00:20:15.521 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.521 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.521 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.521 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.521 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.521 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:15.521 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.521 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.780 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:15.780 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
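The per-bdev checks traced above all follow one pattern: dump a bdev's JSON with rpc.py bdev_get_bdevs and compare single fields with jq (block_size against the Raid Volume's 512; md_size, md_interleave and dif_type expected to be null for Malloc disks). Below is a condensed, illustrative bash sketch of that pattern; the variable names and the hard-coded bdev list are assumptions made for readability, the actual logic is the verify_raid_bdev_properties helper in bdev_raid.sh seen in the trace.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # block size reported by the Raid Volume; every configured base bdev must match it
    raid_bs=$($rpc -s $sock bdev_get_bdevs -b Existed_Raid | jq '.[].block_size')
    for name in NewBaseBdev BaseBdev2 BaseBdev3 BaseBdev4; do
        info=$($rpc -s $sock bdev_get_bdevs -b "$name" | jq '.[]')
        [[ $(jq .block_size <<<"$info") == "$raid_bs" ]]   # 512 == 512 in the run above
        [[ $(jq .md_size <<<"$info") == null ]]            # Malloc disks carry no metadata region
        [[ $(jq .md_interleave <<<"$info") == null ]]
        [[ $(jq .dif_type <<<"$info") == null ]]
    done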
00:20:15.780 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.780 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:15.780 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:15.780 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:15.780 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:16.039 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:16.039 "name": "BaseBdev4", 00:20:16.039 "aliases": [ 00:20:16.039 "e885a51a-0101-41fa-8521-a24c0cad4d7e" 00:20:16.039 ], 00:20:16.039 "product_name": "Malloc disk", 00:20:16.039 "block_size": 512, 00:20:16.039 "num_blocks": 65536, 00:20:16.039 "uuid": "e885a51a-0101-41fa-8521-a24c0cad4d7e", 00:20:16.039 "assigned_rate_limits": { 00:20:16.039 "rw_ios_per_sec": 0, 00:20:16.039 "rw_mbytes_per_sec": 0, 00:20:16.039 "r_mbytes_per_sec": 0, 00:20:16.039 "w_mbytes_per_sec": 0 00:20:16.039 }, 00:20:16.039 "claimed": true, 00:20:16.039 "claim_type": "exclusive_write", 00:20:16.039 "zoned": false, 00:20:16.039 "supported_io_types": { 00:20:16.039 "read": true, 00:20:16.039 "write": true, 00:20:16.039 "unmap": true, 00:20:16.039 "flush": true, 00:20:16.039 "reset": true, 00:20:16.039 "nvme_admin": false, 00:20:16.039 "nvme_io": false, 00:20:16.039 "nvme_io_md": false, 00:20:16.039 "write_zeroes": true, 00:20:16.039 "zcopy": true, 00:20:16.039 "get_zone_info": false, 00:20:16.039 "zone_management": false, 00:20:16.039 "zone_append": false, 00:20:16.039 "compare": false, 00:20:16.039 "compare_and_write": false, 00:20:16.039 "abort": true, 00:20:16.039 "seek_hole": false, 00:20:16.039 "seek_data": false, 00:20:16.039 "copy": true, 00:20:16.039 "nvme_iov_md": false 00:20:16.039 }, 00:20:16.039 "memory_domains": [ 00:20:16.039 { 00:20:16.039 "dma_device_id": "system", 00:20:16.039 "dma_device_type": 1 00:20:16.039 }, 00:20:16.039 { 00:20:16.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.039 "dma_device_type": 2 00:20:16.039 } 00:20:16.039 ], 00:20:16.039 "driver_specific": {} 00:20:16.039 }' 00:20:16.039 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.039 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.039 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:16.039 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.039 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.039 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:16.039 09:24:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.298 09:24:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.298 09:24:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:16.298 09:24:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.298 09:24:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.298 09:24:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:16.298 09:24:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:16.557 [2024-07-15 09:24:25.361353] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:16.557 [2024-07-15 09:24:25.361378] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:16.557 [2024-07-15 09:24:25.361430] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:16.557 [2024-07-15 09:24:25.361490] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:16.557 [2024-07-15 09:24:25.361502] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcca850 name Existed_Raid, state offline 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 164517 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 164517 ']' 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 164517 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 164517 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 164517' 00:20:16.557 killing process with pid 164517 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 164517 00:20:16.557 [2024-07-15 09:24:25.428346] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:16.557 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 164517 00:20:16.557 [2024-07-15 09:24:25.470729] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:16.816 09:24:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:16.816 00:20:16.816 real 0m32.045s 00:20:16.816 user 0m58.782s 00:20:16.816 sys 0m5.627s 00:20:16.816 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:16.816 09:24:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:16.816 ************************************ 00:20:16.816 END TEST raid_state_function_test_sb 00:20:16.816 ************************************ 00:20:16.816 09:24:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:16.816 09:24:25 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:20:16.816 09:24:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:16.816 09:24:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:16.816 09:24:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:17.075 
************************************ 00:20:17.075 START TEST raid_superblock_test 00:20:17.075 ************************************ 00:20:17.075 09:24:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:20:17.075 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:20:17.075 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:17.075 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:17.075 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:17.075 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:17.075 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:17.075 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:17.075 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=169877 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 169877 /var/tmp/spdk-raid.sock 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 169877 ']' 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:17.076 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:17.076 09:24:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:17.076 [2024-07-15 09:24:25.838528] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
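The raid_superblock_test run starting here drives a dedicated SPDK application over its own RPC socket: bdev_svc is launched with -r /var/tmp/spdk-raid.sock and -L bdev_raid, the script waits for the socket to answer, and every subsequent step is an rpc.py call. The bash sketch below is a simplified stand-in for that bring-up, not the literal test script; in particular the polling loop only approximates the waitforlisten helper, and only two representative RPCs from the following trace are shown.

    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
    sock=/var/tmp/spdk-raid.sock
    # start the minimal bdev application with raid debug logging on a private socket
    $spdk/test/app/bdev_svc/bdev_svc -r $sock -L bdev_raid &
    raid_pid=$!
    # stand-in for waitforlisten: poll until the RPC server responds on the socket
    until $spdk/scripts/rpc.py -s $sock -t 1 rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done
    # the test then builds its base devices entirely through RPCs, e.g.:
    $spdk/scripts/rpc.py -s $sock bdev_malloc_create 32 512 -b malloc1
    $spdk/scripts/rpc.py -s $sock bdev_passthru_create -b malloc1 -p pt1 \
        -u 00000000-0000-0000-0000-000000000001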
00:20:17.076 [2024-07-15 09:24:25.838597] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid169877 ] 00:20:17.076 [2024-07-15 09:24:25.970259] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:17.335 [2024-07-15 09:24:26.069654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:17.335 [2024-07-15 09:24:26.133085] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:17.335 [2024-07-15 09:24:26.133132] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:17.902 09:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:17.902 09:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:17.903 09:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:17.903 09:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:17.903 09:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:17.903 09:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:17.903 09:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:17.903 09:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:17.903 09:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:17.903 09:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:17.903 09:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:18.162 malloc1 00:20:18.162 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:18.420 [2024-07-15 09:24:27.242153] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:18.420 [2024-07-15 09:24:27.242203] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:18.420 [2024-07-15 09:24:27.242222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f4a570 00:20:18.420 [2024-07-15 09:24:27.242235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:18.421 [2024-07-15 09:24:27.243818] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:18.421 [2024-07-15 09:24:27.243850] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:18.421 pt1 00:20:18.421 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:18.421 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:18.421 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:18.421 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:18.421 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:18.421 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:18.421 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:18.421 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:18.421 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:18.680 malloc2 00:20:18.680 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:18.939 [2024-07-15 09:24:27.736219] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:18.939 [2024-07-15 09:24:27.736264] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:18.939 [2024-07-15 09:24:27.736281] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f4b970 00:20:18.939 [2024-07-15 09:24:27.736293] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:18.939 [2024-07-15 09:24:27.737717] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:18.939 [2024-07-15 09:24:27.737747] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:18.939 pt2 00:20:18.939 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:18.939 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:18.939 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:18.939 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:18.939 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:18.939 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:18.939 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:18.939 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:18.939 09:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:19.198 malloc3 00:20:19.198 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:19.457 [2024-07-15 09:24:28.234132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:19.457 [2024-07-15 09:24:28.234181] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:19.457 [2024-07-15 09:24:28.234200] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e2340 00:20:19.457 [2024-07-15 09:24:28.234214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:19.457 [2024-07-15 09:24:28.235677] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:19.457 [2024-07-15 09:24:28.235706] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:19.457 pt3 00:20:19.457 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:19.457 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:19.457 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:19.457 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:19.457 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:19.457 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:19.457 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:19.457 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:19.457 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:19.716 malloc4 00:20:19.716 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:19.976 [2024-07-15 09:24:28.741392] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:19.976 [2024-07-15 09:24:28.741442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:19.976 [2024-07-15 09:24:28.741461] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e4c60 00:20:19.976 [2024-07-15 09:24:28.741474] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:19.976 [2024-07-15 09:24:28.742920] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:19.976 [2024-07-15 09:24:28.742956] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:19.976 pt4 00:20:19.976 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:19.976 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:19.976 09:24:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:20.236 [2024-07-15 09:24:28.986073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:20.236 [2024-07-15 09:24:28.987314] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:20.236 [2024-07-15 09:24:28.987368] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:20.236 [2024-07-15 09:24:28.987411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:20.236 [2024-07-15 09:24:28.987577] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f42530 00:20:20.236 [2024-07-15 09:24:28.987588] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:20.236 [2024-07-15 09:24:28.987782] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f40770 00:20:20.236 [2024-07-15 09:24:28.987936] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f42530 00:20:20.236 [2024-07-15 09:24:28.987947] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f42530 00:20:20.236 [2024-07-15 09:24:28.988045] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.236 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:20.495 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.495 "name": "raid_bdev1", 00:20:20.495 "uuid": "67726cea-7f34-4061-a338-8e92fd408473", 00:20:20.495 "strip_size_kb": 64, 00:20:20.495 "state": "online", 00:20:20.495 "raid_level": "concat", 00:20:20.495 "superblock": true, 00:20:20.495 "num_base_bdevs": 4, 00:20:20.495 "num_base_bdevs_discovered": 4, 00:20:20.495 "num_base_bdevs_operational": 4, 00:20:20.495 "base_bdevs_list": [ 00:20:20.495 { 00:20:20.495 "name": "pt1", 00:20:20.495 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:20.495 "is_configured": true, 00:20:20.495 "data_offset": 2048, 00:20:20.495 "data_size": 63488 00:20:20.495 }, 00:20:20.495 { 00:20:20.495 "name": "pt2", 00:20:20.495 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:20.495 "is_configured": true, 00:20:20.495 "data_offset": 2048, 00:20:20.495 "data_size": 63488 00:20:20.495 }, 00:20:20.495 { 00:20:20.495 "name": "pt3", 00:20:20.495 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:20.495 "is_configured": true, 00:20:20.495 "data_offset": 2048, 00:20:20.495 "data_size": 63488 00:20:20.495 }, 00:20:20.495 { 00:20:20.495 "name": "pt4", 00:20:20.495 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:20.495 "is_configured": true, 00:20:20.495 "data_offset": 2048, 00:20:20.495 "data_size": 63488 00:20:20.495 } 00:20:20.495 ] 00:20:20.496 }' 00:20:20.496 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.496 09:24:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:21.061 09:24:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:21.061 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:21.061 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:21.061 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:21.061 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:21.061 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:21.061 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:21.061 09:24:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:21.319 [2024-07-15 09:24:30.049295] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:21.319 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:21.319 "name": "raid_bdev1", 00:20:21.319 "aliases": [ 00:20:21.319 "67726cea-7f34-4061-a338-8e92fd408473" 00:20:21.319 ], 00:20:21.319 "product_name": "Raid Volume", 00:20:21.319 "block_size": 512, 00:20:21.319 "num_blocks": 253952, 00:20:21.319 "uuid": "67726cea-7f34-4061-a338-8e92fd408473", 00:20:21.319 "assigned_rate_limits": { 00:20:21.319 "rw_ios_per_sec": 0, 00:20:21.319 "rw_mbytes_per_sec": 0, 00:20:21.319 "r_mbytes_per_sec": 0, 00:20:21.319 "w_mbytes_per_sec": 0 00:20:21.319 }, 00:20:21.319 "claimed": false, 00:20:21.319 "zoned": false, 00:20:21.319 "supported_io_types": { 00:20:21.319 "read": true, 00:20:21.319 "write": true, 00:20:21.319 "unmap": true, 00:20:21.319 "flush": true, 00:20:21.319 "reset": true, 00:20:21.319 "nvme_admin": false, 00:20:21.319 "nvme_io": false, 00:20:21.319 "nvme_io_md": false, 00:20:21.319 "write_zeroes": true, 00:20:21.319 "zcopy": false, 00:20:21.319 "get_zone_info": false, 00:20:21.319 "zone_management": false, 00:20:21.319 "zone_append": false, 00:20:21.319 "compare": false, 00:20:21.319 "compare_and_write": false, 00:20:21.319 "abort": false, 00:20:21.319 "seek_hole": false, 00:20:21.319 "seek_data": false, 00:20:21.319 "copy": false, 00:20:21.319 "nvme_iov_md": false 00:20:21.319 }, 00:20:21.319 "memory_domains": [ 00:20:21.319 { 00:20:21.319 "dma_device_id": "system", 00:20:21.319 "dma_device_type": 1 00:20:21.319 }, 00:20:21.319 { 00:20:21.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.319 "dma_device_type": 2 00:20:21.319 }, 00:20:21.319 { 00:20:21.319 "dma_device_id": "system", 00:20:21.319 "dma_device_type": 1 00:20:21.319 }, 00:20:21.319 { 00:20:21.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.319 "dma_device_type": 2 00:20:21.319 }, 00:20:21.319 { 00:20:21.319 "dma_device_id": "system", 00:20:21.319 "dma_device_type": 1 00:20:21.319 }, 00:20:21.319 { 00:20:21.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.319 "dma_device_type": 2 00:20:21.319 }, 00:20:21.319 { 00:20:21.319 "dma_device_id": "system", 00:20:21.319 "dma_device_type": 1 00:20:21.319 }, 00:20:21.319 { 00:20:21.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.320 "dma_device_type": 2 00:20:21.320 } 00:20:21.320 ], 00:20:21.320 "driver_specific": { 00:20:21.320 "raid": { 00:20:21.320 "uuid": "67726cea-7f34-4061-a338-8e92fd408473", 00:20:21.320 "strip_size_kb": 64, 00:20:21.320 "state": "online", 00:20:21.320 "raid_level": "concat", 00:20:21.320 "superblock": 
true, 00:20:21.320 "num_base_bdevs": 4, 00:20:21.320 "num_base_bdevs_discovered": 4, 00:20:21.320 "num_base_bdevs_operational": 4, 00:20:21.320 "base_bdevs_list": [ 00:20:21.320 { 00:20:21.320 "name": "pt1", 00:20:21.320 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:21.320 "is_configured": true, 00:20:21.320 "data_offset": 2048, 00:20:21.320 "data_size": 63488 00:20:21.320 }, 00:20:21.320 { 00:20:21.320 "name": "pt2", 00:20:21.320 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:21.320 "is_configured": true, 00:20:21.320 "data_offset": 2048, 00:20:21.320 "data_size": 63488 00:20:21.320 }, 00:20:21.320 { 00:20:21.320 "name": "pt3", 00:20:21.320 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:21.320 "is_configured": true, 00:20:21.320 "data_offset": 2048, 00:20:21.320 "data_size": 63488 00:20:21.320 }, 00:20:21.320 { 00:20:21.320 "name": "pt4", 00:20:21.320 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:21.320 "is_configured": true, 00:20:21.320 "data_offset": 2048, 00:20:21.320 "data_size": 63488 00:20:21.320 } 00:20:21.320 ] 00:20:21.320 } 00:20:21.320 } 00:20:21.320 }' 00:20:21.320 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:21.320 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:21.320 pt2 00:20:21.320 pt3 00:20:21.320 pt4' 00:20:21.320 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:21.320 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:21.320 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:21.578 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:21.578 "name": "pt1", 00:20:21.578 "aliases": [ 00:20:21.578 "00000000-0000-0000-0000-000000000001" 00:20:21.578 ], 00:20:21.578 "product_name": "passthru", 00:20:21.578 "block_size": 512, 00:20:21.578 "num_blocks": 65536, 00:20:21.578 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:21.578 "assigned_rate_limits": { 00:20:21.578 "rw_ios_per_sec": 0, 00:20:21.578 "rw_mbytes_per_sec": 0, 00:20:21.578 "r_mbytes_per_sec": 0, 00:20:21.578 "w_mbytes_per_sec": 0 00:20:21.578 }, 00:20:21.578 "claimed": true, 00:20:21.578 "claim_type": "exclusive_write", 00:20:21.578 "zoned": false, 00:20:21.578 "supported_io_types": { 00:20:21.578 "read": true, 00:20:21.578 "write": true, 00:20:21.578 "unmap": true, 00:20:21.578 "flush": true, 00:20:21.578 "reset": true, 00:20:21.578 "nvme_admin": false, 00:20:21.578 "nvme_io": false, 00:20:21.578 "nvme_io_md": false, 00:20:21.578 "write_zeroes": true, 00:20:21.578 "zcopy": true, 00:20:21.578 "get_zone_info": false, 00:20:21.578 "zone_management": false, 00:20:21.578 "zone_append": false, 00:20:21.578 "compare": false, 00:20:21.578 "compare_and_write": false, 00:20:21.578 "abort": true, 00:20:21.578 "seek_hole": false, 00:20:21.578 "seek_data": false, 00:20:21.578 "copy": true, 00:20:21.578 "nvme_iov_md": false 00:20:21.578 }, 00:20:21.578 "memory_domains": [ 00:20:21.578 { 00:20:21.578 "dma_device_id": "system", 00:20:21.578 "dma_device_type": 1 00:20:21.578 }, 00:20:21.578 { 00:20:21.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.578 "dma_device_type": 2 00:20:21.578 } 00:20:21.578 ], 00:20:21.578 "driver_specific": { 00:20:21.578 "passthru": 
{ 00:20:21.578 "name": "pt1", 00:20:21.578 "base_bdev_name": "malloc1" 00:20:21.578 } 00:20:21.578 } 00:20:21.578 }' 00:20:21.578 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:21.578 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:21.578 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:21.578 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:21.578 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:21.578 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:21.578 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:21.837 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:21.837 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:21.837 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:21.837 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:21.837 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:21.837 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:21.837 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:21.837 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.096 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:22.096 "name": "pt2", 00:20:22.096 "aliases": [ 00:20:22.096 "00000000-0000-0000-0000-000000000002" 00:20:22.096 ], 00:20:22.096 "product_name": "passthru", 00:20:22.096 "block_size": 512, 00:20:22.096 "num_blocks": 65536, 00:20:22.096 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:22.096 "assigned_rate_limits": { 00:20:22.096 "rw_ios_per_sec": 0, 00:20:22.096 "rw_mbytes_per_sec": 0, 00:20:22.096 "r_mbytes_per_sec": 0, 00:20:22.096 "w_mbytes_per_sec": 0 00:20:22.096 }, 00:20:22.096 "claimed": true, 00:20:22.096 "claim_type": "exclusive_write", 00:20:22.096 "zoned": false, 00:20:22.096 "supported_io_types": { 00:20:22.096 "read": true, 00:20:22.096 "write": true, 00:20:22.096 "unmap": true, 00:20:22.096 "flush": true, 00:20:22.096 "reset": true, 00:20:22.096 "nvme_admin": false, 00:20:22.096 "nvme_io": false, 00:20:22.096 "nvme_io_md": false, 00:20:22.096 "write_zeroes": true, 00:20:22.096 "zcopy": true, 00:20:22.096 "get_zone_info": false, 00:20:22.096 "zone_management": false, 00:20:22.096 "zone_append": false, 00:20:22.096 "compare": false, 00:20:22.096 "compare_and_write": false, 00:20:22.096 "abort": true, 00:20:22.096 "seek_hole": false, 00:20:22.096 "seek_data": false, 00:20:22.096 "copy": true, 00:20:22.096 "nvme_iov_md": false 00:20:22.096 }, 00:20:22.096 "memory_domains": [ 00:20:22.096 { 00:20:22.096 "dma_device_id": "system", 00:20:22.096 "dma_device_type": 1 00:20:22.096 }, 00:20:22.096 { 00:20:22.096 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.096 "dma_device_type": 2 00:20:22.096 } 00:20:22.096 ], 00:20:22.096 "driver_specific": { 00:20:22.096 "passthru": { 00:20:22.096 "name": "pt2", 00:20:22.096 "base_bdev_name": "malloc2" 00:20:22.096 } 00:20:22.096 } 00:20:22.096 }' 00:20:22.096 
09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.096 09:24:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.096 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:22.096 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:22.355 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.692 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:22.692 "name": "pt3", 00:20:22.692 "aliases": [ 00:20:22.692 "00000000-0000-0000-0000-000000000003" 00:20:22.692 ], 00:20:22.692 "product_name": "passthru", 00:20:22.692 "block_size": 512, 00:20:22.692 "num_blocks": 65536, 00:20:22.692 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:22.692 "assigned_rate_limits": { 00:20:22.692 "rw_ios_per_sec": 0, 00:20:22.692 "rw_mbytes_per_sec": 0, 00:20:22.692 "r_mbytes_per_sec": 0, 00:20:22.692 "w_mbytes_per_sec": 0 00:20:22.692 }, 00:20:22.692 "claimed": true, 00:20:22.692 "claim_type": "exclusive_write", 00:20:22.692 "zoned": false, 00:20:22.692 "supported_io_types": { 00:20:22.692 "read": true, 00:20:22.692 "write": true, 00:20:22.692 "unmap": true, 00:20:22.692 "flush": true, 00:20:22.692 "reset": true, 00:20:22.692 "nvme_admin": false, 00:20:22.692 "nvme_io": false, 00:20:22.692 "nvme_io_md": false, 00:20:22.692 "write_zeroes": true, 00:20:22.692 "zcopy": true, 00:20:22.692 "get_zone_info": false, 00:20:22.692 "zone_management": false, 00:20:22.692 "zone_append": false, 00:20:22.692 "compare": false, 00:20:22.692 "compare_and_write": false, 00:20:22.692 "abort": true, 00:20:22.692 "seek_hole": false, 00:20:22.692 "seek_data": false, 00:20:22.692 "copy": true, 00:20:22.692 "nvme_iov_md": false 00:20:22.692 }, 00:20:22.692 "memory_domains": [ 00:20:22.692 { 00:20:22.692 "dma_device_id": "system", 00:20:22.692 "dma_device_type": 1 00:20:22.692 }, 00:20:22.692 { 00:20:22.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.692 "dma_device_type": 2 00:20:22.692 } 00:20:22.692 ], 00:20:22.692 "driver_specific": { 00:20:22.692 "passthru": { 00:20:22.692 "name": "pt3", 00:20:22.692 "base_bdev_name": "malloc3" 00:20:22.692 } 00:20:22.692 } 00:20:22.692 }' 00:20:22.692 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.692 09:24:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.692 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:22.692 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:22.951 09:24:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.211 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.211 "name": "pt4", 00:20:23.211 "aliases": [ 00:20:23.211 "00000000-0000-0000-0000-000000000004" 00:20:23.211 ], 00:20:23.211 "product_name": "passthru", 00:20:23.211 "block_size": 512, 00:20:23.211 "num_blocks": 65536, 00:20:23.211 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:23.211 "assigned_rate_limits": { 00:20:23.211 "rw_ios_per_sec": 0, 00:20:23.211 "rw_mbytes_per_sec": 0, 00:20:23.211 "r_mbytes_per_sec": 0, 00:20:23.211 "w_mbytes_per_sec": 0 00:20:23.211 }, 00:20:23.211 "claimed": true, 00:20:23.211 "claim_type": "exclusive_write", 00:20:23.211 "zoned": false, 00:20:23.211 "supported_io_types": { 00:20:23.211 "read": true, 00:20:23.211 "write": true, 00:20:23.211 "unmap": true, 00:20:23.211 "flush": true, 00:20:23.211 "reset": true, 00:20:23.211 "nvme_admin": false, 00:20:23.211 "nvme_io": false, 00:20:23.211 "nvme_io_md": false, 00:20:23.211 "write_zeroes": true, 00:20:23.211 "zcopy": true, 00:20:23.211 "get_zone_info": false, 00:20:23.211 "zone_management": false, 00:20:23.211 "zone_append": false, 00:20:23.211 "compare": false, 00:20:23.211 "compare_and_write": false, 00:20:23.211 "abort": true, 00:20:23.211 "seek_hole": false, 00:20:23.211 "seek_data": false, 00:20:23.211 "copy": true, 00:20:23.211 "nvme_iov_md": false 00:20:23.211 }, 00:20:23.211 "memory_domains": [ 00:20:23.211 { 00:20:23.211 "dma_device_id": "system", 00:20:23.211 "dma_device_type": 1 00:20:23.211 }, 00:20:23.211 { 00:20:23.211 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.211 "dma_device_type": 2 00:20:23.211 } 00:20:23.211 ], 00:20:23.211 "driver_specific": { 00:20:23.211 "passthru": { 00:20:23.211 "name": "pt4", 00:20:23.211 "base_bdev_name": "malloc4" 00:20:23.211 } 00:20:23.211 } 00:20:23.211 }' 00:20:23.211 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.470 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.729 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.729 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:23.729 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:23.729 [2024-07-15 09:24:32.676114] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:23.989 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=67726cea-7f34-4061-a338-8e92fd408473 00:20:23.989 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 67726cea-7f34-4061-a338-8e92fd408473 ']' 00:20:23.989 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:23.989 [2024-07-15 09:24:32.912437] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:23.989 [2024-07-15 09:24:32.912461] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:23.989 [2024-07-15 09:24:32.912516] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:23.989 [2024-07-15 09:24:32.912581] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:23.989 [2024-07-15 09:24:32.912593] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f42530 name raid_bdev1, state offline 00:20:23.989 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.989 09:24:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:24.248 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:24.248 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:24.248 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:24.248 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:24.506 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:24.506 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:24.764 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:24.764 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:25.023 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:25.023 09:24:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:25.282 09:24:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:25.282 09:24:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:25.540 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:25.799 [2024-07-15 09:24:34.608875] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:25.799 [2024-07-15 09:24:34.610282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:25.799 [2024-07-15 09:24:34.610327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
00:20:25.799 [2024-07-15 09:24:34.610361] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:25.799 [2024-07-15 09:24:34.610408] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:25.799 [2024-07-15 09:24:34.610451] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:25.799 [2024-07-15 09:24:34.610473] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:25.799 [2024-07-15 09:24:34.610495] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:25.799 [2024-07-15 09:24:34.610513] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:25.799 [2024-07-15 09:24:34.610524] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20edff0 name raid_bdev1, state configuring 00:20:25.799 request: 00:20:25.799 { 00:20:25.799 "name": "raid_bdev1", 00:20:25.799 "raid_level": "concat", 00:20:25.799 "base_bdevs": [ 00:20:25.799 "malloc1", 00:20:25.799 "malloc2", 00:20:25.799 "malloc3", 00:20:25.799 "malloc4" 00:20:25.799 ], 00:20:25.799 "strip_size_kb": 64, 00:20:25.799 "superblock": false, 00:20:25.799 "method": "bdev_raid_create", 00:20:25.799 "req_id": 1 00:20:25.799 } 00:20:25.799 Got JSON-RPC error response 00:20:25.799 response: 00:20:25.799 { 00:20:25.799 "code": -17, 00:20:25.799 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:25.799 } 00:20:25.799 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:25.799 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:25.799 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:25.799 09:24:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:25.799 09:24:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.799 09:24:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:26.069 09:24:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:26.069 09:24:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:26.069 09:24:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:26.326 [2024-07-15 09:24:35.102115] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:26.326 [2024-07-15 09:24:35.102166] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:26.326 [2024-07-15 09:24:35.102187] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f4a7a0 00:20:26.326 [2024-07-15 09:24:35.102200] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:26.326 [2024-07-15 09:24:35.103883] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:26.326 [2024-07-15 09:24:35.103915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:26.326 [2024-07-15 
09:24:35.103995] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:26.326 [2024-07-15 09:24:35.104023] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:26.326 pt1 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.326 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:26.583 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.583 "name": "raid_bdev1", 00:20:26.583 "uuid": "67726cea-7f34-4061-a338-8e92fd408473", 00:20:26.583 "strip_size_kb": 64, 00:20:26.583 "state": "configuring", 00:20:26.583 "raid_level": "concat", 00:20:26.583 "superblock": true, 00:20:26.583 "num_base_bdevs": 4, 00:20:26.583 "num_base_bdevs_discovered": 1, 00:20:26.583 "num_base_bdevs_operational": 4, 00:20:26.583 "base_bdevs_list": [ 00:20:26.583 { 00:20:26.583 "name": "pt1", 00:20:26.583 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:26.583 "is_configured": true, 00:20:26.583 "data_offset": 2048, 00:20:26.583 "data_size": 63488 00:20:26.583 }, 00:20:26.583 { 00:20:26.583 "name": null, 00:20:26.583 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:26.583 "is_configured": false, 00:20:26.583 "data_offset": 2048, 00:20:26.583 "data_size": 63488 00:20:26.583 }, 00:20:26.583 { 00:20:26.583 "name": null, 00:20:26.583 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:26.583 "is_configured": false, 00:20:26.583 "data_offset": 2048, 00:20:26.583 "data_size": 63488 00:20:26.583 }, 00:20:26.583 { 00:20:26.583 "name": null, 00:20:26.583 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:26.583 "is_configured": false, 00:20:26.584 "data_offset": 2048, 00:20:26.584 "data_size": 63488 00:20:26.584 } 00:20:26.584 ] 00:20:26.584 }' 00:20:26.584 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.584 09:24:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.148 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:27.148 09:24:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:27.406 [2024-07-15 09:24:36.176973] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:27.406 [2024-07-15 09:24:36.177027] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:27.406 [2024-07-15 09:24:36.177044] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f41ea0 00:20:27.406 [2024-07-15 09:24:36.177057] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:27.406 [2024-07-15 09:24:36.177429] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:27.406 [2024-07-15 09:24:36.177449] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:27.406 [2024-07-15 09:24:36.177519] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:27.406 [2024-07-15 09:24:36.177539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:27.406 pt2 00:20:27.406 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:27.663 [2024-07-15 09:24:36.417615] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.663 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.920 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:27.920 "name": "raid_bdev1", 00:20:27.920 "uuid": "67726cea-7f34-4061-a338-8e92fd408473", 00:20:27.920 "strip_size_kb": 64, 00:20:27.920 "state": "configuring", 00:20:27.920 "raid_level": "concat", 00:20:27.920 "superblock": true, 00:20:27.920 "num_base_bdevs": 4, 00:20:27.920 "num_base_bdevs_discovered": 1, 00:20:27.920 "num_base_bdevs_operational": 4, 00:20:27.920 "base_bdevs_list": [ 00:20:27.920 { 00:20:27.920 "name": "pt1", 00:20:27.920 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:27.920 "is_configured": true, 00:20:27.920 "data_offset": 2048, 00:20:27.920 "data_size": 63488 00:20:27.920 }, 00:20:27.920 
{ 00:20:27.920 "name": null, 00:20:27.920 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:27.920 "is_configured": false, 00:20:27.920 "data_offset": 2048, 00:20:27.920 "data_size": 63488 00:20:27.920 }, 00:20:27.920 { 00:20:27.920 "name": null, 00:20:27.920 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:27.920 "is_configured": false, 00:20:27.920 "data_offset": 2048, 00:20:27.920 "data_size": 63488 00:20:27.920 }, 00:20:27.920 { 00:20:27.920 "name": null, 00:20:27.920 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:27.920 "is_configured": false, 00:20:27.920 "data_offset": 2048, 00:20:27.920 "data_size": 63488 00:20:27.920 } 00:20:27.920 ] 00:20:27.920 }' 00:20:27.920 09:24:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:27.920 09:24:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.485 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:28.485 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:28.485 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:28.743 [2024-07-15 09:24:37.464375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:28.743 [2024-07-15 09:24:37.464426] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:28.743 [2024-07-15 09:24:37.464445] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f40ec0 00:20:28.743 [2024-07-15 09:24:37.464457] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:28.743 [2024-07-15 09:24:37.464800] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:28.743 [2024-07-15 09:24:37.464820] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:28.743 [2024-07-15 09:24:37.464883] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:28.743 [2024-07-15 09:24:37.464902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:28.743 pt2 00:20:28.743 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:28.743 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:28.743 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:29.001 [2024-07-15 09:24:37.709036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:29.001 [2024-07-15 09:24:37.709074] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:29.001 [2024-07-15 09:24:37.709089] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f410f0 00:20:29.001 [2024-07-15 09:24:37.709102] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:29.001 [2024-07-15 09:24:37.709420] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:29.001 [2024-07-15 09:24:37.709437] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:29.001 [2024-07-15 09:24:37.709494] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:29.002 [2024-07-15 09:24:37.709512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:29.002 pt3 00:20:29.002 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:29.002 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:29.002 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:29.002 [2024-07-15 09:24:37.953691] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:29.002 [2024-07-15 09:24:37.953740] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:29.002 [2024-07-15 09:24:37.953756] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f49af0 00:20:29.002 [2024-07-15 09:24:37.953769] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:29.002 [2024-07-15 09:24:37.954106] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:29.002 [2024-07-15 09:24:37.954125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:29.002 [2024-07-15 09:24:37.954183] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:29.002 [2024-07-15 09:24:37.954203] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:29.002 [2024-07-15 09:24:37.954327] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f438f0 00:20:29.002 [2024-07-15 09:24:37.954345] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:29.002 [2024-07-15 09:24:37.954520] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f43150 00:20:29.002 [2024-07-15 09:24:37.954652] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f438f0 00:20:29.002 [2024-07-15 09:24:37.954661] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f438f0 00:20:29.260 [2024-07-15 09:24:37.954760] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:29.260 pt4 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.260 09:24:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.260 09:24:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.518 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.518 "name": "raid_bdev1", 00:20:29.518 "uuid": "67726cea-7f34-4061-a338-8e92fd408473", 00:20:29.518 "strip_size_kb": 64, 00:20:29.518 "state": "online", 00:20:29.518 "raid_level": "concat", 00:20:29.518 "superblock": true, 00:20:29.518 "num_base_bdevs": 4, 00:20:29.518 "num_base_bdevs_discovered": 4, 00:20:29.518 "num_base_bdevs_operational": 4, 00:20:29.518 "base_bdevs_list": [ 00:20:29.518 { 00:20:29.518 "name": "pt1", 00:20:29.518 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:29.518 "is_configured": true, 00:20:29.518 "data_offset": 2048, 00:20:29.518 "data_size": 63488 00:20:29.518 }, 00:20:29.518 { 00:20:29.518 "name": "pt2", 00:20:29.518 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:29.518 "is_configured": true, 00:20:29.518 "data_offset": 2048, 00:20:29.518 "data_size": 63488 00:20:29.518 }, 00:20:29.518 { 00:20:29.518 "name": "pt3", 00:20:29.518 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:29.518 "is_configured": true, 00:20:29.518 "data_offset": 2048, 00:20:29.518 "data_size": 63488 00:20:29.518 }, 00:20:29.518 { 00:20:29.518 "name": "pt4", 00:20:29.518 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:29.518 "is_configured": true, 00:20:29.518 "data_offset": 2048, 00:20:29.519 "data_size": 63488 00:20:29.519 } 00:20:29.519 ] 00:20:29.519 }' 00:20:29.519 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.519 09:24:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.084 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:30.084 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:30.084 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:30.084 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:30.084 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:30.085 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:30.085 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:30.085 09:24:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:30.342 [2024-07-15 09:24:39.040854] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:30.342 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:30.343 "name": "raid_bdev1", 00:20:30.343 "aliases": [ 00:20:30.343 "67726cea-7f34-4061-a338-8e92fd408473" 00:20:30.343 ], 00:20:30.343 "product_name": "Raid Volume", 00:20:30.343 "block_size": 512, 00:20:30.343 "num_blocks": 253952, 00:20:30.343 "uuid": 
"67726cea-7f34-4061-a338-8e92fd408473", 00:20:30.343 "assigned_rate_limits": { 00:20:30.343 "rw_ios_per_sec": 0, 00:20:30.343 "rw_mbytes_per_sec": 0, 00:20:30.343 "r_mbytes_per_sec": 0, 00:20:30.343 "w_mbytes_per_sec": 0 00:20:30.343 }, 00:20:30.343 "claimed": false, 00:20:30.343 "zoned": false, 00:20:30.343 "supported_io_types": { 00:20:30.343 "read": true, 00:20:30.343 "write": true, 00:20:30.343 "unmap": true, 00:20:30.343 "flush": true, 00:20:30.343 "reset": true, 00:20:30.343 "nvme_admin": false, 00:20:30.343 "nvme_io": false, 00:20:30.343 "nvme_io_md": false, 00:20:30.343 "write_zeroes": true, 00:20:30.343 "zcopy": false, 00:20:30.343 "get_zone_info": false, 00:20:30.343 "zone_management": false, 00:20:30.343 "zone_append": false, 00:20:30.343 "compare": false, 00:20:30.343 "compare_and_write": false, 00:20:30.343 "abort": false, 00:20:30.343 "seek_hole": false, 00:20:30.343 "seek_data": false, 00:20:30.343 "copy": false, 00:20:30.343 "nvme_iov_md": false 00:20:30.343 }, 00:20:30.343 "memory_domains": [ 00:20:30.343 { 00:20:30.343 "dma_device_id": "system", 00:20:30.343 "dma_device_type": 1 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.343 "dma_device_type": 2 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "dma_device_id": "system", 00:20:30.343 "dma_device_type": 1 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.343 "dma_device_type": 2 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "dma_device_id": "system", 00:20:30.343 "dma_device_type": 1 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.343 "dma_device_type": 2 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "dma_device_id": "system", 00:20:30.343 "dma_device_type": 1 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.343 "dma_device_type": 2 00:20:30.343 } 00:20:30.343 ], 00:20:30.343 "driver_specific": { 00:20:30.343 "raid": { 00:20:30.343 "uuid": "67726cea-7f34-4061-a338-8e92fd408473", 00:20:30.343 "strip_size_kb": 64, 00:20:30.343 "state": "online", 00:20:30.343 "raid_level": "concat", 00:20:30.343 "superblock": true, 00:20:30.343 "num_base_bdevs": 4, 00:20:30.343 "num_base_bdevs_discovered": 4, 00:20:30.343 "num_base_bdevs_operational": 4, 00:20:30.343 "base_bdevs_list": [ 00:20:30.343 { 00:20:30.343 "name": "pt1", 00:20:30.343 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:30.343 "is_configured": true, 00:20:30.343 "data_offset": 2048, 00:20:30.343 "data_size": 63488 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "name": "pt2", 00:20:30.343 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:30.343 "is_configured": true, 00:20:30.343 "data_offset": 2048, 00:20:30.343 "data_size": 63488 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "name": "pt3", 00:20:30.343 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:30.343 "is_configured": true, 00:20:30.343 "data_offset": 2048, 00:20:30.343 "data_size": 63488 00:20:30.343 }, 00:20:30.343 { 00:20:30.343 "name": "pt4", 00:20:30.343 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:30.343 "is_configured": true, 00:20:30.343 "data_offset": 2048, 00:20:30.343 "data_size": 63488 00:20:30.343 } 00:20:30.343 ] 00:20:30.343 } 00:20:30.343 } 00:20:30.343 }' 00:20:30.343 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:30.343 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:20:30.343 pt2 00:20:30.343 pt3 00:20:30.343 pt4' 00:20:30.343 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:30.343 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:30.343 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:30.600 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:30.600 "name": "pt1", 00:20:30.600 "aliases": [ 00:20:30.600 "00000000-0000-0000-0000-000000000001" 00:20:30.600 ], 00:20:30.600 "product_name": "passthru", 00:20:30.600 "block_size": 512, 00:20:30.600 "num_blocks": 65536, 00:20:30.600 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:30.600 "assigned_rate_limits": { 00:20:30.601 "rw_ios_per_sec": 0, 00:20:30.601 "rw_mbytes_per_sec": 0, 00:20:30.601 "r_mbytes_per_sec": 0, 00:20:30.601 "w_mbytes_per_sec": 0 00:20:30.601 }, 00:20:30.601 "claimed": true, 00:20:30.601 "claim_type": "exclusive_write", 00:20:30.601 "zoned": false, 00:20:30.601 "supported_io_types": { 00:20:30.601 "read": true, 00:20:30.601 "write": true, 00:20:30.601 "unmap": true, 00:20:30.601 "flush": true, 00:20:30.601 "reset": true, 00:20:30.601 "nvme_admin": false, 00:20:30.601 "nvme_io": false, 00:20:30.601 "nvme_io_md": false, 00:20:30.601 "write_zeroes": true, 00:20:30.601 "zcopy": true, 00:20:30.601 "get_zone_info": false, 00:20:30.601 "zone_management": false, 00:20:30.601 "zone_append": false, 00:20:30.601 "compare": false, 00:20:30.601 "compare_and_write": false, 00:20:30.601 "abort": true, 00:20:30.601 "seek_hole": false, 00:20:30.601 "seek_data": false, 00:20:30.601 "copy": true, 00:20:30.601 "nvme_iov_md": false 00:20:30.601 }, 00:20:30.601 "memory_domains": [ 00:20:30.601 { 00:20:30.601 "dma_device_id": "system", 00:20:30.601 "dma_device_type": 1 00:20:30.601 }, 00:20:30.601 { 00:20:30.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.601 "dma_device_type": 2 00:20:30.601 } 00:20:30.601 ], 00:20:30.601 "driver_specific": { 00:20:30.601 "passthru": { 00:20:30.601 "name": "pt1", 00:20:30.601 "base_bdev_name": "malloc1" 00:20:30.601 } 00:20:30.601 } 00:20:30.601 }' 00:20:30.601 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.601 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.601 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:30.601 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.601 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.601 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:30.601 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.859 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.859 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:30.859 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:30.859 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:30.859 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:30.859 09:24:39 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:30.859 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:30.859 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.117 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:31.117 "name": "pt2", 00:20:31.117 "aliases": [ 00:20:31.117 "00000000-0000-0000-0000-000000000002" 00:20:31.117 ], 00:20:31.117 "product_name": "passthru", 00:20:31.117 "block_size": 512, 00:20:31.117 "num_blocks": 65536, 00:20:31.117 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:31.117 "assigned_rate_limits": { 00:20:31.117 "rw_ios_per_sec": 0, 00:20:31.117 "rw_mbytes_per_sec": 0, 00:20:31.117 "r_mbytes_per_sec": 0, 00:20:31.117 "w_mbytes_per_sec": 0 00:20:31.117 }, 00:20:31.117 "claimed": true, 00:20:31.117 "claim_type": "exclusive_write", 00:20:31.117 "zoned": false, 00:20:31.117 "supported_io_types": { 00:20:31.117 "read": true, 00:20:31.117 "write": true, 00:20:31.117 "unmap": true, 00:20:31.117 "flush": true, 00:20:31.117 "reset": true, 00:20:31.117 "nvme_admin": false, 00:20:31.117 "nvme_io": false, 00:20:31.117 "nvme_io_md": false, 00:20:31.117 "write_zeroes": true, 00:20:31.117 "zcopy": true, 00:20:31.117 "get_zone_info": false, 00:20:31.117 "zone_management": false, 00:20:31.117 "zone_append": false, 00:20:31.117 "compare": false, 00:20:31.117 "compare_and_write": false, 00:20:31.117 "abort": true, 00:20:31.117 "seek_hole": false, 00:20:31.117 "seek_data": false, 00:20:31.117 "copy": true, 00:20:31.117 "nvme_iov_md": false 00:20:31.117 }, 00:20:31.117 "memory_domains": [ 00:20:31.117 { 00:20:31.117 "dma_device_id": "system", 00:20:31.117 "dma_device_type": 1 00:20:31.117 }, 00:20:31.117 { 00:20:31.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.117 "dma_device_type": 2 00:20:31.117 } 00:20:31.117 ], 00:20:31.117 "driver_specific": { 00:20:31.117 "passthru": { 00:20:31.117 "name": "pt2", 00:20:31.117 "base_bdev_name": "malloc2" 00:20:31.117 } 00:20:31.117 } 00:20:31.117 }' 00:20:31.117 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.117 09:24:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.117 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:31.117 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:31.375 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.634 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:31.634 "name": "pt3", 00:20:31.634 "aliases": [ 00:20:31.634 "00000000-0000-0000-0000-000000000003" 00:20:31.634 ], 00:20:31.634 "product_name": "passthru", 00:20:31.634 "block_size": 512, 00:20:31.634 "num_blocks": 65536, 00:20:31.634 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:31.634 "assigned_rate_limits": { 00:20:31.634 "rw_ios_per_sec": 0, 00:20:31.634 "rw_mbytes_per_sec": 0, 00:20:31.634 "r_mbytes_per_sec": 0, 00:20:31.634 "w_mbytes_per_sec": 0 00:20:31.634 }, 00:20:31.634 "claimed": true, 00:20:31.634 "claim_type": "exclusive_write", 00:20:31.634 "zoned": false, 00:20:31.634 "supported_io_types": { 00:20:31.634 "read": true, 00:20:31.634 "write": true, 00:20:31.634 "unmap": true, 00:20:31.634 "flush": true, 00:20:31.634 "reset": true, 00:20:31.634 "nvme_admin": false, 00:20:31.634 "nvme_io": false, 00:20:31.634 "nvme_io_md": false, 00:20:31.634 "write_zeroes": true, 00:20:31.634 "zcopy": true, 00:20:31.634 "get_zone_info": false, 00:20:31.634 "zone_management": false, 00:20:31.634 "zone_append": false, 00:20:31.634 "compare": false, 00:20:31.634 "compare_and_write": false, 00:20:31.634 "abort": true, 00:20:31.634 "seek_hole": false, 00:20:31.634 "seek_data": false, 00:20:31.634 "copy": true, 00:20:31.634 "nvme_iov_md": false 00:20:31.634 }, 00:20:31.634 "memory_domains": [ 00:20:31.634 { 00:20:31.634 "dma_device_id": "system", 00:20:31.634 "dma_device_type": 1 00:20:31.634 }, 00:20:31.634 { 00:20:31.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.634 "dma_device_type": 2 00:20:31.634 } 00:20:31.634 ], 00:20:31.634 "driver_specific": { 00:20:31.634 "passthru": { 00:20:31.634 "name": "pt3", 00:20:31.634 "base_bdev_name": "malloc3" 00:20:31.634 } 00:20:31.634 } 00:20:31.634 }' 00:20:31.635 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.893 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.893 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:31.893 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.894 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.894 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.894 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.894 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.894 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:31.894 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.153 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.153 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:32.153 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:32.153 09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:32.153 
09:24:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:32.411 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:32.411 "name": "pt4", 00:20:32.411 "aliases": [ 00:20:32.411 "00000000-0000-0000-0000-000000000004" 00:20:32.411 ], 00:20:32.411 "product_name": "passthru", 00:20:32.411 "block_size": 512, 00:20:32.411 "num_blocks": 65536, 00:20:32.411 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:32.411 "assigned_rate_limits": { 00:20:32.411 "rw_ios_per_sec": 0, 00:20:32.411 "rw_mbytes_per_sec": 0, 00:20:32.411 "r_mbytes_per_sec": 0, 00:20:32.411 "w_mbytes_per_sec": 0 00:20:32.411 }, 00:20:32.411 "claimed": true, 00:20:32.411 "claim_type": "exclusive_write", 00:20:32.411 "zoned": false, 00:20:32.411 "supported_io_types": { 00:20:32.411 "read": true, 00:20:32.411 "write": true, 00:20:32.411 "unmap": true, 00:20:32.411 "flush": true, 00:20:32.411 "reset": true, 00:20:32.411 "nvme_admin": false, 00:20:32.411 "nvme_io": false, 00:20:32.411 "nvme_io_md": false, 00:20:32.411 "write_zeroes": true, 00:20:32.411 "zcopy": true, 00:20:32.411 "get_zone_info": false, 00:20:32.411 "zone_management": false, 00:20:32.411 "zone_append": false, 00:20:32.411 "compare": false, 00:20:32.411 "compare_and_write": false, 00:20:32.411 "abort": true, 00:20:32.411 "seek_hole": false, 00:20:32.411 "seek_data": false, 00:20:32.411 "copy": true, 00:20:32.411 "nvme_iov_md": false 00:20:32.411 }, 00:20:32.411 "memory_domains": [ 00:20:32.411 { 00:20:32.411 "dma_device_id": "system", 00:20:32.412 "dma_device_type": 1 00:20:32.412 }, 00:20:32.412 { 00:20:32.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.412 "dma_device_type": 2 00:20:32.412 } 00:20:32.412 ], 00:20:32.412 "driver_specific": { 00:20:32.412 "passthru": { 00:20:32.412 "name": "pt4", 00:20:32.412 "base_bdev_name": "malloc4" 00:20:32.412 } 00:20:32.412 } 00:20:32.412 }' 00:20:32.412 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.412 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.412 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.412 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.412 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.412 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.412 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.412 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.696 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.696 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.696 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.696 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:32.696 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:32.696 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:32.956 [2024-07-15 09:24:41.720008] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:32.956 09:24:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 67726cea-7f34-4061-a338-8e92fd408473 '!=' 67726cea-7f34-4061-a338-8e92fd408473 ']' 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 169877 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 169877 ']' 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 169877 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 169877 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 169877' 00:20:32.956 killing process with pid 169877 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 169877 00:20:32.956 [2024-07-15 09:24:41.792085] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:32.956 [2024-07-15 09:24:41.792150] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:32.956 [2024-07-15 09:24:41.792213] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:32.956 [2024-07-15 09:24:41.792225] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f438f0 name raid_bdev1, state offline 00:20:32.956 09:24:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 169877 00:20:32.956 [2024-07-15 09:24:41.834908] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:33.216 09:24:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:33.216 00:20:33.216 real 0m16.290s 00:20:33.216 user 0m29.412s 00:20:33.216 sys 0m2.940s 00:20:33.216 09:24:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:33.216 09:24:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.216 ************************************ 00:20:33.216 END TEST raid_superblock_test 00:20:33.216 ************************************ 00:20:33.216 09:24:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:33.216 09:24:42 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:20:33.216 09:24:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:33.216 09:24:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:33.216 09:24:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:33.216 ************************************ 00:20:33.216 START TEST raid_read_error_test 00:20:33.216 ************************************ 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test concat 4 read 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:33.216 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:33.473 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.cyMO69N950 00:20:33.473 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=172333 00:20:33.473 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 172333 /var/tmp/spdk-raid.sock 00:20:33.473 09:24:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:33.473 09:24:42 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 172333 ']' 00:20:33.473 09:24:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:33.473 09:24:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:33.473 09:24:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:33.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:33.474 09:24:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:33.474 09:24:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.474 [2024-07-15 09:24:42.230074] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:20:33.474 [2024-07-15 09:24:42.230142] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid172333 ] 00:20:33.474 [2024-07-15 09:24:42.359918] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:33.731 [2024-07-15 09:24:42.466283] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:33.731 [2024-07-15 09:24:42.537238] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:33.731 [2024-07-15 09:24:42.537276] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:34.297 09:24:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:34.297 09:24:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:34.297 09:24:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:34.297 09:24:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:34.555 BaseBdev1_malloc 00:20:34.555 09:24:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:34.812 true 00:20:34.812 09:24:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:35.070 [2024-07-15 09:24:43.895765] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:35.070 [2024-07-15 09:24:43.895809] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:35.070 [2024-07-15 09:24:43.895830] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb090d0 00:20:35.070 [2024-07-15 09:24:43.895843] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:35.070 [2024-07-15 09:24:43.897737] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:35.070 [2024-07-15 09:24:43.897765] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:35.070 BaseBdev1 00:20:35.070 09:24:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for 
bdev in "${base_bdevs[@]}" 00:20:35.071 09:24:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:35.637 BaseBdev2_malloc 00:20:35.637 09:24:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:35.895 true 00:20:35.895 09:24:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:36.462 [2024-07-15 09:24:45.160917] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:36.462 [2024-07-15 09:24:45.160973] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:36.462 [2024-07-15 09:24:45.160994] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb0d910 00:20:36.462 [2024-07-15 09:24:45.161006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:36.462 [2024-07-15 09:24:45.162559] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:36.462 [2024-07-15 09:24:45.162586] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:36.462 BaseBdev2 00:20:36.462 09:24:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:36.462 09:24:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:36.733 BaseBdev3_malloc 00:20:36.733 09:24:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:37.043 true 00:20:37.043 09:24:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:37.301 [2024-07-15 09:24:46.176272] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:37.301 [2024-07-15 09:24:46.176319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:37.301 [2024-07-15 09:24:46.176339] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb0fbd0 00:20:37.301 [2024-07-15 09:24:46.176352] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:37.301 [2024-07-15 09:24:46.177970] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:37.301 [2024-07-15 09:24:46.177998] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:37.301 BaseBdev3 00:20:37.301 09:24:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:37.301 09:24:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:37.869 BaseBdev4_malloc 00:20:37.869 09:24:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:38.127 true 00:20:38.127 09:24:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:38.694 [2024-07-15 09:24:47.436804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:38.694 [2024-07-15 09:24:47.436849] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:38.694 [2024-07-15 09:24:47.436871] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb10aa0 00:20:38.694 [2024-07-15 09:24:47.436883] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:38.694 [2024-07-15 09:24:47.438474] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:38.694 [2024-07-15 09:24:47.438502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:38.694 BaseBdev4 00:20:38.695 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:39.263 [2024-07-15 09:24:47.946161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:39.263 [2024-07-15 09:24:47.947531] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:39.263 [2024-07-15 09:24:47.947600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:39.263 [2024-07-15 09:24:47.947662] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:39.263 [2024-07-15 09:24:47.947906] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb0ac20 00:20:39.263 [2024-07-15 09:24:47.947918] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:39.263 [2024-07-15 09:24:47.948130] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x95f260 00:20:39.263 [2024-07-15 09:24:47.948285] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb0ac20 00:20:39.263 [2024-07-15 09:24:47.948295] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb0ac20 00:20:39.263 [2024-07-15 09:24:47.948403] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.263 09:24:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.523 09:24:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.523 "name": "raid_bdev1", 00:20:39.523 "uuid": "39c3edea-952b-43d1-a08e-113597adae88", 00:20:39.523 "strip_size_kb": 64, 00:20:39.523 "state": "online", 00:20:39.523 "raid_level": "concat", 00:20:39.523 "superblock": true, 00:20:39.523 "num_base_bdevs": 4, 00:20:39.523 "num_base_bdevs_discovered": 4, 00:20:39.523 "num_base_bdevs_operational": 4, 00:20:39.523 "base_bdevs_list": [ 00:20:39.523 { 00:20:39.523 "name": "BaseBdev1", 00:20:39.523 "uuid": "21607e13-9c7c-5422-8d24-068b690ab7a5", 00:20:39.523 "is_configured": true, 00:20:39.523 "data_offset": 2048, 00:20:39.523 "data_size": 63488 00:20:39.523 }, 00:20:39.523 { 00:20:39.523 "name": "BaseBdev2", 00:20:39.523 "uuid": "43ddca3d-055c-5283-9f0f-747ac5b67b76", 00:20:39.523 "is_configured": true, 00:20:39.523 "data_offset": 2048, 00:20:39.523 "data_size": 63488 00:20:39.523 }, 00:20:39.523 { 00:20:39.523 "name": "BaseBdev3", 00:20:39.523 "uuid": "aa6b81ec-e2f2-56b0-8978-81944717bf44", 00:20:39.523 "is_configured": true, 00:20:39.523 "data_offset": 2048, 00:20:39.523 "data_size": 63488 00:20:39.523 }, 00:20:39.523 { 00:20:39.523 "name": "BaseBdev4", 00:20:39.523 "uuid": "fe2bb242-b27c-5db1-bbc4-4865b6000d92", 00:20:39.523 "is_configured": true, 00:20:39.523 "data_offset": 2048, 00:20:39.523 "data_size": 63488 00:20:39.523 } 00:20:39.523 ] 00:20:39.523 }' 00:20:39.523 09:24:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.523 09:24:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:40.090 09:24:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:40.090 09:24:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:40.090 [2024-07-15 09:24:48.908992] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xafcfc0 00:20:41.027 09:24:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.286 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:41.545 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:41.545 "name": "raid_bdev1", 00:20:41.545 "uuid": "39c3edea-952b-43d1-a08e-113597adae88", 00:20:41.545 "strip_size_kb": 64, 00:20:41.545 "state": "online", 00:20:41.545 "raid_level": "concat", 00:20:41.545 "superblock": true, 00:20:41.545 "num_base_bdevs": 4, 00:20:41.545 "num_base_bdevs_discovered": 4, 00:20:41.545 "num_base_bdevs_operational": 4, 00:20:41.545 "base_bdevs_list": [ 00:20:41.545 { 00:20:41.545 "name": "BaseBdev1", 00:20:41.545 "uuid": "21607e13-9c7c-5422-8d24-068b690ab7a5", 00:20:41.545 "is_configured": true, 00:20:41.545 "data_offset": 2048, 00:20:41.545 "data_size": 63488 00:20:41.545 }, 00:20:41.545 { 00:20:41.545 "name": "BaseBdev2", 00:20:41.545 "uuid": "43ddca3d-055c-5283-9f0f-747ac5b67b76", 00:20:41.545 "is_configured": true, 00:20:41.545 "data_offset": 2048, 00:20:41.545 "data_size": 63488 00:20:41.545 }, 00:20:41.545 { 00:20:41.545 "name": "BaseBdev3", 00:20:41.545 "uuid": "aa6b81ec-e2f2-56b0-8978-81944717bf44", 00:20:41.545 "is_configured": true, 00:20:41.545 "data_offset": 2048, 00:20:41.545 "data_size": 63488 00:20:41.545 }, 00:20:41.545 { 00:20:41.545 "name": "BaseBdev4", 00:20:41.545 "uuid": "fe2bb242-b27c-5db1-bbc4-4865b6000d92", 00:20:41.545 "is_configured": true, 00:20:41.545 "data_offset": 2048, 00:20:41.545 "data_size": 63488 00:20:41.545 } 00:20:41.545 ] 00:20:41.545 }' 00:20:41.545 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:41.545 09:24:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.113 09:24:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:42.373 [2024-07-15 09:24:51.138849] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:42.373 [2024-07-15 09:24:51.138892] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:42.373 [2024-07-15 09:24:51.142071] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:42.373 [2024-07-15 09:24:51.142110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:42.373 [2024-07-15 09:24:51.142152] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:42.373 [2024-07-15 09:24:51.142163] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb0ac20 name raid_bdev1, state offline 00:20:42.373 0 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 172333 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 172333 ']' 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 172333 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 172333 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 172333' 00:20:42.373 killing process with pid 172333 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 172333 00:20:42.373 [2024-07-15 09:24:51.204531] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:42.373 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 172333 00:20:42.373 [2024-07-15 09:24:51.235842] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.cyMO69N950 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:20:42.632 00:20:42.632 real 0m9.324s 00:20:42.632 user 0m15.340s 00:20:42.632 sys 0m1.604s 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:42.632 09:24:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.632 ************************************ 00:20:42.632 END TEST raid_read_error_test 00:20:42.632 ************************************ 00:20:42.632 09:24:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:42.632 09:24:51 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:20:42.632 09:24:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:42.632 09:24:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:42.632 09:24:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:42.632 ************************************ 00:20:42.632 START TEST raid_write_error_test 00:20:42.632 ************************************ 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 
4 write 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:42.632 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.v2T7HQa3RA 00:20:42.892 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=173617 00:20:42.892 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 173617 /var/tmp/spdk-raid.sock 00:20:42.892 09:24:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:42.892 
09:24:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 173617 ']' 00:20:42.892 09:24:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:42.892 09:24:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:42.892 09:24:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:42.892 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:42.892 09:24:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:42.892 09:24:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.892 [2024-07-15 09:24:51.645728] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:20:42.892 [2024-07-15 09:24:51.645800] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid173617 ] 00:20:42.892 [2024-07-15 09:24:51.773113] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:43.150 [2024-07-15 09:24:51.879659] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:43.150 [2024-07-15 09:24:51.941660] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:43.150 [2024-07-15 09:24:51.941685] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:43.717 09:24:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:43.717 09:24:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:43.717 09:24:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:43.717 09:24:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:43.975 BaseBdev1_malloc 00:20:43.975 09:24:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:44.235 true 00:20:44.235 09:24:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:44.494 [2024-07-15 09:24:53.295397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:44.494 [2024-07-15 09:24:53.295443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:44.494 [2024-07-15 09:24:53.295465] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12960d0 00:20:44.494 [2024-07-15 09:24:53.295484] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:44.494 [2024-07-15 09:24:53.297409] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:44.494 [2024-07-15 09:24:53.297440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:44.494 BaseBdev1 00:20:44.494 09:24:53 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:44.494 09:24:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:44.753 BaseBdev2_malloc 00:20:44.753 09:24:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:45.011 true 00:20:45.011 09:24:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:45.271 [2024-07-15 09:24:54.031174] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:45.271 [2024-07-15 09:24:54.031217] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:45.271 [2024-07-15 09:24:54.031239] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x129a910 00:20:45.271 [2024-07-15 09:24:54.031251] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:45.271 [2024-07-15 09:24:54.032830] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:45.271 [2024-07-15 09:24:54.032858] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:45.271 BaseBdev2 00:20:45.271 09:24:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:45.271 09:24:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:45.530 BaseBdev3_malloc 00:20:45.530 09:24:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:45.789 true 00:20:45.789 09:24:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:46.048 [2024-07-15 09:24:54.769687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:46.048 [2024-07-15 09:24:54.769733] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:46.048 [2024-07-15 09:24:54.769753] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x129cbd0 00:20:46.048 [2024-07-15 09:24:54.769766] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:46.048 [2024-07-15 09:24:54.771339] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:46.048 [2024-07-15 09:24:54.771368] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:46.048 BaseBdev3 00:20:46.048 09:24:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:46.048 09:24:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:46.307 BaseBdev4_malloc 00:20:46.307 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:46.566 true 00:20:46.566 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:46.566 [2024-07-15 09:24:55.501447] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:46.566 [2024-07-15 09:24:55.501500] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:46.566 [2024-07-15 09:24:55.501523] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x129daa0 00:20:46.566 [2024-07-15 09:24:55.501536] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:46.566 [2024-07-15 09:24:55.503160] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:46.566 [2024-07-15 09:24:55.503191] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:46.566 BaseBdev4 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:46.825 [2024-07-15 09:24:55.746145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:46.825 [2024-07-15 09:24:55.747547] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:46.825 [2024-07-15 09:24:55.747618] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:46.825 [2024-07-15 09:24:55.747680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:46.825 [2024-07-15 09:24:55.747917] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1297c20 00:20:46.825 [2024-07-15 09:24:55.747941] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:46.825 [2024-07-15 09:24:55.748140] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10ec260 00:20:46.825 [2024-07-15 09:24:55.748296] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1297c20 00:20:46.825 [2024-07-15 09:24:55.748306] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1297c20 00:20:46.825 [2024-07-15 09:24:55.748417] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.825 09:24:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.084 09:24:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.084 "name": "raid_bdev1", 00:20:47.084 "uuid": "90dbb323-6cdb-4a6f-a3e8-b8c4f59b47bf", 00:20:47.084 "strip_size_kb": 64, 00:20:47.084 "state": "online", 00:20:47.084 "raid_level": "concat", 00:20:47.084 "superblock": true, 00:20:47.084 "num_base_bdevs": 4, 00:20:47.084 "num_base_bdevs_discovered": 4, 00:20:47.084 "num_base_bdevs_operational": 4, 00:20:47.084 "base_bdevs_list": [ 00:20:47.084 { 00:20:47.084 "name": "BaseBdev1", 00:20:47.084 "uuid": "03f8246e-cada-50f1-aa64-bac14baadc13", 00:20:47.084 "is_configured": true, 00:20:47.084 "data_offset": 2048, 00:20:47.084 "data_size": 63488 00:20:47.084 }, 00:20:47.084 { 00:20:47.084 "name": "BaseBdev2", 00:20:47.084 "uuid": "8491358e-a634-5148-ac59-71574e62e34f", 00:20:47.084 "is_configured": true, 00:20:47.084 "data_offset": 2048, 00:20:47.084 "data_size": 63488 00:20:47.084 }, 00:20:47.084 { 00:20:47.084 "name": "BaseBdev3", 00:20:47.084 "uuid": "2859fb7b-551a-5bd5-b65c-0423afe5d05f", 00:20:47.084 "is_configured": true, 00:20:47.084 "data_offset": 2048, 00:20:47.084 "data_size": 63488 00:20:47.084 }, 00:20:47.084 { 00:20:47.084 "name": "BaseBdev4", 00:20:47.084 "uuid": "843e4429-c1a4-5439-b12c-ea951750bb4a", 00:20:47.084 "is_configured": true, 00:20:47.084 "data_offset": 2048, 00:20:47.084 "data_size": 63488 00:20:47.084 } 00:20:47.084 ] 00:20:47.084 }' 00:20:47.084 09:24:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.084 09:24:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:48.020 09:24:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:48.020 09:24:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:48.020 [2024-07-15 09:24:56.713076] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1289fc0 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:48.957 
09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.957 09:24:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:49.216 09:24:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:49.216 "name": "raid_bdev1", 00:20:49.216 "uuid": "90dbb323-6cdb-4a6f-a3e8-b8c4f59b47bf", 00:20:49.216 "strip_size_kb": 64, 00:20:49.216 "state": "online", 00:20:49.216 "raid_level": "concat", 00:20:49.216 "superblock": true, 00:20:49.216 "num_base_bdevs": 4, 00:20:49.216 "num_base_bdevs_discovered": 4, 00:20:49.216 "num_base_bdevs_operational": 4, 00:20:49.216 "base_bdevs_list": [ 00:20:49.216 { 00:20:49.216 "name": "BaseBdev1", 00:20:49.216 "uuid": "03f8246e-cada-50f1-aa64-bac14baadc13", 00:20:49.216 "is_configured": true, 00:20:49.216 "data_offset": 2048, 00:20:49.216 "data_size": 63488 00:20:49.216 }, 00:20:49.216 { 00:20:49.216 "name": "BaseBdev2", 00:20:49.216 "uuid": "8491358e-a634-5148-ac59-71574e62e34f", 00:20:49.216 "is_configured": true, 00:20:49.216 "data_offset": 2048, 00:20:49.216 "data_size": 63488 00:20:49.216 }, 00:20:49.216 { 00:20:49.216 "name": "BaseBdev3", 00:20:49.216 "uuid": "2859fb7b-551a-5bd5-b65c-0423afe5d05f", 00:20:49.216 "is_configured": true, 00:20:49.216 "data_offset": 2048, 00:20:49.216 "data_size": 63488 00:20:49.216 }, 00:20:49.216 { 00:20:49.216 "name": "BaseBdev4", 00:20:49.216 "uuid": "843e4429-c1a4-5439-b12c-ea951750bb4a", 00:20:49.216 "is_configured": true, 00:20:49.216 "data_offset": 2048, 00:20:49.216 "data_size": 63488 00:20:49.216 } 00:20:49.216 ] 00:20:49.216 }' 00:20:49.216 09:24:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:49.216 09:24:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:49.785 09:24:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:50.044 [2024-07-15 09:24:58.954768] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:50.044 [2024-07-15 09:24:58.954817] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:50.044 [2024-07-15 09:24:58.957997] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:50.044 [2024-07-15 09:24:58.958034] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:50.044 [2024-07-15 09:24:58.958075] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:50.044 
[2024-07-15 09:24:58.958087] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1297c20 name raid_bdev1, state offline 00:20:50.044 0 00:20:50.044 09:24:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 173617 00:20:50.044 09:24:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 173617 ']' 00:20:50.044 09:24:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 173617 00:20:50.044 09:24:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:50.044 09:24:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:50.045 09:24:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 173617 00:20:50.303 09:24:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:50.303 09:24:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:50.303 09:24:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 173617' 00:20:50.303 killing process with pid 173617 00:20:50.303 09:24:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 173617 00:20:50.303 [2024-07-15 09:24:59.020802] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:50.303 09:24:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 173617 00:20:50.303 [2024-07-15 09:24:59.052391] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.v2T7HQa3RA 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:20:50.562 00:20:50.562 real 0m7.725s 00:20:50.562 user 0m12.358s 00:20:50.562 sys 0m1.373s 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:50.562 09:24:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.562 ************************************ 00:20:50.562 END TEST raid_write_error_test 00:20:50.562 ************************************ 00:20:50.562 09:24:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:50.562 09:24:59 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:20:50.562 09:24:59 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:20:50.562 09:24:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:50.562 09:24:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:50.562 09:24:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:50.562 ************************************ 00:20:50.562 START TEST raid_state_function_test 
00:20:50.562 ************************************ 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:50.562 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=174649 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 174649' 00:20:50.563 Process raid pid: 174649 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 174649 /var/tmp/spdk-raid.sock 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 174649 ']' 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:50.563 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:50.563 09:24:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.563 [2024-07-15 09:24:59.451728] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:20:50.563 [2024-07-15 09:24:59.451791] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:50.821 [2024-07-15 09:24:59.580266] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:50.821 [2024-07-15 09:24:59.682742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:50.821 [2024-07-15 09:24:59.747646] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:50.821 [2024-07-15 09:24:59.747684] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:51.827 09:25:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:51.827 09:25:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:51.827 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:51.827 [2024-07-15 09:25:00.627160] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:51.827 [2024-07-15 09:25:00.627204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:51.827 [2024-07-15 09:25:00.627215] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:51.827 [2024-07-15 09:25:00.627227] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:51.827 [2024-07-15 09:25:00.627236] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:51.827 [2024-07-15 09:25:00.627248] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:51.827 [2024-07-15 09:25:00.627257] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:51.827 [2024-07-15 09:25:00.627268] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:51.827 09:25:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:51.827 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:51.827 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:51.827 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:51.827 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:51.828 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:51.828 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.828 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:51.828 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.828 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.828 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.828 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:52.084 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:52.084 "name": "Existed_Raid", 00:20:52.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.084 "strip_size_kb": 0, 00:20:52.084 "state": "configuring", 00:20:52.084 "raid_level": "raid1", 00:20:52.084 "superblock": false, 00:20:52.084 "num_base_bdevs": 4, 00:20:52.084 "num_base_bdevs_discovered": 0, 00:20:52.084 "num_base_bdevs_operational": 4, 00:20:52.084 "base_bdevs_list": [ 00:20:52.084 { 00:20:52.084 "name": "BaseBdev1", 00:20:52.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.084 "is_configured": false, 00:20:52.084 "data_offset": 0, 00:20:52.084 "data_size": 0 00:20:52.084 }, 00:20:52.084 { 00:20:52.084 "name": "BaseBdev2", 00:20:52.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.084 "is_configured": false, 00:20:52.084 "data_offset": 0, 00:20:52.084 "data_size": 0 00:20:52.084 }, 00:20:52.084 { 00:20:52.084 "name": "BaseBdev3", 00:20:52.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.084 "is_configured": false, 00:20:52.084 "data_offset": 0, 00:20:52.084 "data_size": 0 00:20:52.084 }, 00:20:52.084 { 00:20:52.084 "name": "BaseBdev4", 00:20:52.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.084 "is_configured": false, 00:20:52.084 "data_offset": 0, 00:20:52.084 "data_size": 0 00:20:52.084 } 00:20:52.084 ] 00:20:52.084 }' 00:20:52.084 09:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:52.084 09:25:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:52.701 09:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:52.959 [2024-07-15 09:25:01.717939] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:52.959 [2024-07-15 09:25:01.717971] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8f8aa0 name Existed_Raid, state 
configuring 00:20:52.959 09:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:53.217 [2024-07-15 09:25:01.966590] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:53.217 [2024-07-15 09:25:01.966622] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:53.217 [2024-07-15 09:25:01.966632] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:53.217 [2024-07-15 09:25:01.966644] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:53.217 [2024-07-15 09:25:01.966653] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:53.217 [2024-07-15 09:25:01.966664] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:53.217 [2024-07-15 09:25:01.966673] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:53.217 [2024-07-15 09:25:01.966684] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:53.217 09:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:53.473 [2024-07-15 09:25:02.221108] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:53.473 BaseBdev1 00:20:53.473 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:53.473 09:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:53.473 09:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:53.473 09:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:53.473 09:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:53.473 09:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:53.473 09:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:53.731 09:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:53.989 [ 00:20:53.989 { 00:20:53.989 "name": "BaseBdev1", 00:20:53.989 "aliases": [ 00:20:53.989 "d6daea9d-a45f-49a7-8952-72501272c3ef" 00:20:53.989 ], 00:20:53.989 "product_name": "Malloc disk", 00:20:53.989 "block_size": 512, 00:20:53.989 "num_blocks": 65536, 00:20:53.989 "uuid": "d6daea9d-a45f-49a7-8952-72501272c3ef", 00:20:53.989 "assigned_rate_limits": { 00:20:53.989 "rw_ios_per_sec": 0, 00:20:53.989 "rw_mbytes_per_sec": 0, 00:20:53.989 "r_mbytes_per_sec": 0, 00:20:53.989 "w_mbytes_per_sec": 0 00:20:53.989 }, 00:20:53.989 "claimed": true, 00:20:53.989 "claim_type": "exclusive_write", 00:20:53.989 "zoned": false, 00:20:53.989 "supported_io_types": { 00:20:53.989 "read": true, 00:20:53.989 "write": true, 00:20:53.989 "unmap": true, 00:20:53.989 
"flush": true, 00:20:53.989 "reset": true, 00:20:53.989 "nvme_admin": false, 00:20:53.989 "nvme_io": false, 00:20:53.989 "nvme_io_md": false, 00:20:53.989 "write_zeroes": true, 00:20:53.989 "zcopy": true, 00:20:53.989 "get_zone_info": false, 00:20:53.989 "zone_management": false, 00:20:53.989 "zone_append": false, 00:20:53.989 "compare": false, 00:20:53.989 "compare_and_write": false, 00:20:53.989 "abort": true, 00:20:53.989 "seek_hole": false, 00:20:53.989 "seek_data": false, 00:20:53.989 "copy": true, 00:20:53.989 "nvme_iov_md": false 00:20:53.989 }, 00:20:53.989 "memory_domains": [ 00:20:53.989 { 00:20:53.989 "dma_device_id": "system", 00:20:53.989 "dma_device_type": 1 00:20:53.989 }, 00:20:53.989 { 00:20:53.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.989 "dma_device_type": 2 00:20:53.989 } 00:20:53.989 ], 00:20:53.989 "driver_specific": {} 00:20:53.989 } 00:20:53.989 ] 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.989 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:54.247 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.247 "name": "Existed_Raid", 00:20:54.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.247 "strip_size_kb": 0, 00:20:54.247 "state": "configuring", 00:20:54.247 "raid_level": "raid1", 00:20:54.247 "superblock": false, 00:20:54.247 "num_base_bdevs": 4, 00:20:54.247 "num_base_bdevs_discovered": 1, 00:20:54.247 "num_base_bdevs_operational": 4, 00:20:54.247 "base_bdevs_list": [ 00:20:54.247 { 00:20:54.247 "name": "BaseBdev1", 00:20:54.247 "uuid": "d6daea9d-a45f-49a7-8952-72501272c3ef", 00:20:54.247 "is_configured": true, 00:20:54.247 "data_offset": 0, 00:20:54.247 "data_size": 65536 00:20:54.247 }, 00:20:54.247 { 00:20:54.247 "name": "BaseBdev2", 00:20:54.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.247 "is_configured": false, 00:20:54.247 "data_offset": 0, 00:20:54.247 "data_size": 0 00:20:54.247 }, 00:20:54.247 { 00:20:54.247 "name": "BaseBdev3", 00:20:54.247 "uuid": "00000000-0000-0000-0000-000000000000", 
00:20:54.247 "is_configured": false, 00:20:54.247 "data_offset": 0, 00:20:54.247 "data_size": 0 00:20:54.247 }, 00:20:54.247 { 00:20:54.247 "name": "BaseBdev4", 00:20:54.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.247 "is_configured": false, 00:20:54.247 "data_offset": 0, 00:20:54.247 "data_size": 0 00:20:54.247 } 00:20:54.247 ] 00:20:54.247 }' 00:20:54.247 09:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.247 09:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:54.812 09:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:55.070 [2024-07-15 09:25:03.809333] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:55.070 [2024-07-15 09:25:03.809373] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8f8310 name Existed_Raid, state configuring 00:20:55.070 09:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:55.329 [2024-07-15 09:25:04.054009] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:55.329 [2024-07-15 09:25:04.055493] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:55.329 [2024-07-15 09:25:04.055527] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:55.329 [2024-07-15 09:25:04.055537] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:55.329 [2024-07-15 09:25:04.055554] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:55.329 [2024-07-15 09:25:04.055564] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:55.329 [2024-07-15 09:25:04.055575] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.329 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:55.588 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.588 "name": "Existed_Raid", 00:20:55.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.588 "strip_size_kb": 0, 00:20:55.588 "state": "configuring", 00:20:55.589 "raid_level": "raid1", 00:20:55.589 "superblock": false, 00:20:55.589 "num_base_bdevs": 4, 00:20:55.589 "num_base_bdevs_discovered": 1, 00:20:55.589 "num_base_bdevs_operational": 4, 00:20:55.589 "base_bdevs_list": [ 00:20:55.589 { 00:20:55.589 "name": "BaseBdev1", 00:20:55.589 "uuid": "d6daea9d-a45f-49a7-8952-72501272c3ef", 00:20:55.589 "is_configured": true, 00:20:55.589 "data_offset": 0, 00:20:55.589 "data_size": 65536 00:20:55.589 }, 00:20:55.589 { 00:20:55.589 "name": "BaseBdev2", 00:20:55.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.589 "is_configured": false, 00:20:55.589 "data_offset": 0, 00:20:55.589 "data_size": 0 00:20:55.589 }, 00:20:55.589 { 00:20:55.589 "name": "BaseBdev3", 00:20:55.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.589 "is_configured": false, 00:20:55.589 "data_offset": 0, 00:20:55.589 "data_size": 0 00:20:55.589 }, 00:20:55.589 { 00:20:55.589 "name": "BaseBdev4", 00:20:55.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.589 "is_configured": false, 00:20:55.589 "data_offset": 0, 00:20:55.589 "data_size": 0 00:20:55.589 } 00:20:55.589 ] 00:20:55.589 }' 00:20:55.589 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.589 09:25:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:56.156 09:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:56.413 [2024-07-15 09:25:05.128218] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:56.413 BaseBdev2 00:20:56.413 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:56.413 09:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:56.413 09:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:56.413 09:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:56.413 09:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:56.413 09:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:56.413 09:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:56.671 09:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:56.671 [ 00:20:56.671 { 00:20:56.671 "name": "BaseBdev2", 00:20:56.671 "aliases": [ 
00:20:56.671 "c29c19d8-79bc-4301-813c-41d90ae3fb86" 00:20:56.671 ], 00:20:56.671 "product_name": "Malloc disk", 00:20:56.671 "block_size": 512, 00:20:56.671 "num_blocks": 65536, 00:20:56.671 "uuid": "c29c19d8-79bc-4301-813c-41d90ae3fb86", 00:20:56.671 "assigned_rate_limits": { 00:20:56.671 "rw_ios_per_sec": 0, 00:20:56.671 "rw_mbytes_per_sec": 0, 00:20:56.671 "r_mbytes_per_sec": 0, 00:20:56.671 "w_mbytes_per_sec": 0 00:20:56.671 }, 00:20:56.671 "claimed": true, 00:20:56.671 "claim_type": "exclusive_write", 00:20:56.671 "zoned": false, 00:20:56.671 "supported_io_types": { 00:20:56.671 "read": true, 00:20:56.671 "write": true, 00:20:56.671 "unmap": true, 00:20:56.671 "flush": true, 00:20:56.671 "reset": true, 00:20:56.671 "nvme_admin": false, 00:20:56.671 "nvme_io": false, 00:20:56.671 "nvme_io_md": false, 00:20:56.671 "write_zeroes": true, 00:20:56.671 "zcopy": true, 00:20:56.671 "get_zone_info": false, 00:20:56.671 "zone_management": false, 00:20:56.671 "zone_append": false, 00:20:56.671 "compare": false, 00:20:56.671 "compare_and_write": false, 00:20:56.671 "abort": true, 00:20:56.671 "seek_hole": false, 00:20:56.671 "seek_data": false, 00:20:56.671 "copy": true, 00:20:56.671 "nvme_iov_md": false 00:20:56.671 }, 00:20:56.671 "memory_domains": [ 00:20:56.671 { 00:20:56.671 "dma_device_id": "system", 00:20:56.671 "dma_device_type": 1 00:20:56.671 }, 00:20:56.671 { 00:20:56.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.671 "dma_device_type": 2 00:20:56.671 } 00:20:56.671 ], 00:20:56.671 "driver_specific": {} 00:20:56.671 } 00:20:56.671 ] 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:56.930 "name": "Existed_Raid", 00:20:56.930 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:56.930 "strip_size_kb": 0, 00:20:56.930 "state": "configuring", 00:20:56.930 "raid_level": "raid1", 00:20:56.930 "superblock": false, 00:20:56.930 "num_base_bdevs": 4, 00:20:56.930 "num_base_bdevs_discovered": 2, 00:20:56.930 "num_base_bdevs_operational": 4, 00:20:56.930 "base_bdevs_list": [ 00:20:56.930 { 00:20:56.930 "name": "BaseBdev1", 00:20:56.930 "uuid": "d6daea9d-a45f-49a7-8952-72501272c3ef", 00:20:56.930 "is_configured": true, 00:20:56.930 "data_offset": 0, 00:20:56.930 "data_size": 65536 00:20:56.930 }, 00:20:56.930 { 00:20:56.930 "name": "BaseBdev2", 00:20:56.930 "uuid": "c29c19d8-79bc-4301-813c-41d90ae3fb86", 00:20:56.930 "is_configured": true, 00:20:56.930 "data_offset": 0, 00:20:56.930 "data_size": 65536 00:20:56.930 }, 00:20:56.930 { 00:20:56.930 "name": "BaseBdev3", 00:20:56.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.930 "is_configured": false, 00:20:56.930 "data_offset": 0, 00:20:56.930 "data_size": 0 00:20:56.930 }, 00:20:56.930 { 00:20:56.930 "name": "BaseBdev4", 00:20:56.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.930 "is_configured": false, 00:20:56.930 "data_offset": 0, 00:20:56.930 "data_size": 0 00:20:56.930 } 00:20:56.930 ] 00:20:56.930 }' 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:56.930 09:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:57.866 09:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:57.866 [2024-07-15 09:25:06.703817] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:57.866 BaseBdev3 00:20:57.866 09:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:57.866 09:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:57.866 09:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:57.866 09:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:57.866 09:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:57.866 09:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:57.866 09:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:58.125 09:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:58.384 [ 00:20:58.384 { 00:20:58.384 "name": "BaseBdev3", 00:20:58.384 "aliases": [ 00:20:58.384 "c8a371a0-1abe-472b-9fd0-eaa764b257f9" 00:20:58.384 ], 00:20:58.384 "product_name": "Malloc disk", 00:20:58.384 "block_size": 512, 00:20:58.384 "num_blocks": 65536, 00:20:58.384 "uuid": "c8a371a0-1abe-472b-9fd0-eaa764b257f9", 00:20:58.385 "assigned_rate_limits": { 00:20:58.385 "rw_ios_per_sec": 0, 00:20:58.385 "rw_mbytes_per_sec": 0, 00:20:58.385 "r_mbytes_per_sec": 0, 00:20:58.385 "w_mbytes_per_sec": 0 00:20:58.385 }, 00:20:58.385 "claimed": true, 00:20:58.385 "claim_type": "exclusive_write", 00:20:58.385 "zoned": 
false, 00:20:58.385 "supported_io_types": { 00:20:58.385 "read": true, 00:20:58.385 "write": true, 00:20:58.385 "unmap": true, 00:20:58.385 "flush": true, 00:20:58.385 "reset": true, 00:20:58.385 "nvme_admin": false, 00:20:58.385 "nvme_io": false, 00:20:58.385 "nvme_io_md": false, 00:20:58.385 "write_zeroes": true, 00:20:58.385 "zcopy": true, 00:20:58.385 "get_zone_info": false, 00:20:58.385 "zone_management": false, 00:20:58.385 "zone_append": false, 00:20:58.385 "compare": false, 00:20:58.385 "compare_and_write": false, 00:20:58.385 "abort": true, 00:20:58.385 "seek_hole": false, 00:20:58.385 "seek_data": false, 00:20:58.385 "copy": true, 00:20:58.385 "nvme_iov_md": false 00:20:58.385 }, 00:20:58.385 "memory_domains": [ 00:20:58.385 { 00:20:58.385 "dma_device_id": "system", 00:20:58.385 "dma_device_type": 1 00:20:58.385 }, 00:20:58.385 { 00:20:58.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.385 "dma_device_type": 2 00:20:58.385 } 00:20:58.385 ], 00:20:58.385 "driver_specific": {} 00:20:58.385 } 00:20:58.385 ] 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.385 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:58.645 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.645 "name": "Existed_Raid", 00:20:58.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.645 "strip_size_kb": 0, 00:20:58.645 "state": "configuring", 00:20:58.645 "raid_level": "raid1", 00:20:58.645 "superblock": false, 00:20:58.645 "num_base_bdevs": 4, 00:20:58.645 "num_base_bdevs_discovered": 3, 00:20:58.645 "num_base_bdevs_operational": 4, 00:20:58.645 "base_bdevs_list": [ 00:20:58.645 { 00:20:58.645 "name": "BaseBdev1", 00:20:58.645 "uuid": "d6daea9d-a45f-49a7-8952-72501272c3ef", 00:20:58.645 "is_configured": true, 00:20:58.645 "data_offset": 0, 00:20:58.645 "data_size": 65536 
00:20:58.645 }, 00:20:58.645 { 00:20:58.645 "name": "BaseBdev2", 00:20:58.645 "uuid": "c29c19d8-79bc-4301-813c-41d90ae3fb86", 00:20:58.645 "is_configured": true, 00:20:58.645 "data_offset": 0, 00:20:58.645 "data_size": 65536 00:20:58.645 }, 00:20:58.645 { 00:20:58.645 "name": "BaseBdev3", 00:20:58.645 "uuid": "c8a371a0-1abe-472b-9fd0-eaa764b257f9", 00:20:58.645 "is_configured": true, 00:20:58.645 "data_offset": 0, 00:20:58.645 "data_size": 65536 00:20:58.645 }, 00:20:58.645 { 00:20:58.645 "name": "BaseBdev4", 00:20:58.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.645 "is_configured": false, 00:20:58.645 "data_offset": 0, 00:20:58.645 "data_size": 0 00:20:58.645 } 00:20:58.645 ] 00:20:58.645 }' 00:20:58.645 09:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.645 09:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:59.212 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:59.472 [2024-07-15 09:25:08.300566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:59.472 [2024-07-15 09:25:08.300608] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8f9350 00:20:59.472 [2024-07-15 09:25:08.300618] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:59.472 [2024-07-15 09:25:08.300875] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8f9020 00:20:59.472 [2024-07-15 09:25:08.301019] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8f9350 00:20:59.472 [2024-07-15 09:25:08.301030] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8f9350 00:20:59.472 [2024-07-15 09:25:08.301197] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:59.472 BaseBdev4 00:20:59.472 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:59.472 09:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:59.472 09:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:59.472 09:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:59.472 09:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:59.472 09:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:59.472 09:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:59.731 09:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:59.990 [ 00:20:59.990 { 00:20:59.990 "name": "BaseBdev4", 00:20:59.990 "aliases": [ 00:20:59.990 "03e9b104-1002-4f33-8f70-18fafb4e5eff" 00:20:59.990 ], 00:20:59.990 "product_name": "Malloc disk", 00:20:59.990 "block_size": 512, 00:20:59.990 "num_blocks": 65536, 00:20:59.990 "uuid": "03e9b104-1002-4f33-8f70-18fafb4e5eff", 00:20:59.990 "assigned_rate_limits": { 00:20:59.990 "rw_ios_per_sec": 0, 00:20:59.990 
"rw_mbytes_per_sec": 0, 00:20:59.990 "r_mbytes_per_sec": 0, 00:20:59.990 "w_mbytes_per_sec": 0 00:20:59.990 }, 00:20:59.990 "claimed": true, 00:20:59.990 "claim_type": "exclusive_write", 00:20:59.990 "zoned": false, 00:20:59.990 "supported_io_types": { 00:20:59.990 "read": true, 00:20:59.990 "write": true, 00:20:59.990 "unmap": true, 00:20:59.990 "flush": true, 00:20:59.990 "reset": true, 00:20:59.990 "nvme_admin": false, 00:20:59.990 "nvme_io": false, 00:20:59.990 "nvme_io_md": false, 00:20:59.990 "write_zeroes": true, 00:20:59.990 "zcopy": true, 00:20:59.990 "get_zone_info": false, 00:20:59.990 "zone_management": false, 00:20:59.990 "zone_append": false, 00:20:59.990 "compare": false, 00:20:59.990 "compare_and_write": false, 00:20:59.990 "abort": true, 00:20:59.990 "seek_hole": false, 00:20:59.990 "seek_data": false, 00:20:59.990 "copy": true, 00:20:59.990 "nvme_iov_md": false 00:20:59.990 }, 00:20:59.990 "memory_domains": [ 00:20:59.990 { 00:20:59.990 "dma_device_id": "system", 00:20:59.990 "dma_device_type": 1 00:20:59.990 }, 00:20:59.990 { 00:20:59.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.990 "dma_device_type": 2 00:20:59.990 } 00:20:59.990 ], 00:20:59.990 "driver_specific": {} 00:20:59.990 } 00:20:59.990 ] 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.990 09:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:00.249 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.249 "name": "Existed_Raid", 00:21:00.249 "uuid": "70738bec-3fe9-4e57-8161-cc4c80ff5f29", 00:21:00.249 "strip_size_kb": 0, 00:21:00.249 "state": "online", 00:21:00.249 "raid_level": "raid1", 00:21:00.249 "superblock": false, 00:21:00.249 "num_base_bdevs": 4, 00:21:00.249 "num_base_bdevs_discovered": 4, 00:21:00.249 "num_base_bdevs_operational": 4, 00:21:00.249 "base_bdevs_list": [ 00:21:00.249 { 
00:21:00.249 "name": "BaseBdev1", 00:21:00.249 "uuid": "d6daea9d-a45f-49a7-8952-72501272c3ef", 00:21:00.249 "is_configured": true, 00:21:00.249 "data_offset": 0, 00:21:00.249 "data_size": 65536 00:21:00.249 }, 00:21:00.249 { 00:21:00.249 "name": "BaseBdev2", 00:21:00.249 "uuid": "c29c19d8-79bc-4301-813c-41d90ae3fb86", 00:21:00.249 "is_configured": true, 00:21:00.249 "data_offset": 0, 00:21:00.249 "data_size": 65536 00:21:00.249 }, 00:21:00.249 { 00:21:00.249 "name": "BaseBdev3", 00:21:00.249 "uuid": "c8a371a0-1abe-472b-9fd0-eaa764b257f9", 00:21:00.249 "is_configured": true, 00:21:00.249 "data_offset": 0, 00:21:00.249 "data_size": 65536 00:21:00.249 }, 00:21:00.249 { 00:21:00.249 "name": "BaseBdev4", 00:21:00.249 "uuid": "03e9b104-1002-4f33-8f70-18fafb4e5eff", 00:21:00.249 "is_configured": true, 00:21:00.249 "data_offset": 0, 00:21:00.249 "data_size": 65536 00:21:00.249 } 00:21:00.249 ] 00:21:00.249 }' 00:21:00.249 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.249 09:25:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:00.816 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:00.816 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:00.816 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:00.816 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:00.816 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:00.816 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:00.816 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:00.816 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:01.074 [2024-07-15 09:25:09.800885] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:01.074 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:01.074 "name": "Existed_Raid", 00:21:01.074 "aliases": [ 00:21:01.074 "70738bec-3fe9-4e57-8161-cc4c80ff5f29" 00:21:01.074 ], 00:21:01.074 "product_name": "Raid Volume", 00:21:01.074 "block_size": 512, 00:21:01.074 "num_blocks": 65536, 00:21:01.074 "uuid": "70738bec-3fe9-4e57-8161-cc4c80ff5f29", 00:21:01.074 "assigned_rate_limits": { 00:21:01.074 "rw_ios_per_sec": 0, 00:21:01.074 "rw_mbytes_per_sec": 0, 00:21:01.074 "r_mbytes_per_sec": 0, 00:21:01.074 "w_mbytes_per_sec": 0 00:21:01.074 }, 00:21:01.074 "claimed": false, 00:21:01.074 "zoned": false, 00:21:01.074 "supported_io_types": { 00:21:01.074 "read": true, 00:21:01.074 "write": true, 00:21:01.074 "unmap": false, 00:21:01.074 "flush": false, 00:21:01.074 "reset": true, 00:21:01.074 "nvme_admin": false, 00:21:01.074 "nvme_io": false, 00:21:01.074 "nvme_io_md": false, 00:21:01.074 "write_zeroes": true, 00:21:01.074 "zcopy": false, 00:21:01.074 "get_zone_info": false, 00:21:01.074 "zone_management": false, 00:21:01.074 "zone_append": false, 00:21:01.074 "compare": false, 00:21:01.074 "compare_and_write": false, 00:21:01.074 "abort": false, 00:21:01.074 "seek_hole": false, 00:21:01.074 "seek_data": false, 00:21:01.074 "copy": false, 00:21:01.074 "nvme_iov_md": 
false 00:21:01.074 }, 00:21:01.074 "memory_domains": [ 00:21:01.074 { 00:21:01.074 "dma_device_id": "system", 00:21:01.074 "dma_device_type": 1 00:21:01.074 }, 00:21:01.074 { 00:21:01.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.074 "dma_device_type": 2 00:21:01.074 }, 00:21:01.074 { 00:21:01.074 "dma_device_id": "system", 00:21:01.074 "dma_device_type": 1 00:21:01.074 }, 00:21:01.074 { 00:21:01.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.074 "dma_device_type": 2 00:21:01.074 }, 00:21:01.074 { 00:21:01.074 "dma_device_id": "system", 00:21:01.074 "dma_device_type": 1 00:21:01.074 }, 00:21:01.074 { 00:21:01.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.074 "dma_device_type": 2 00:21:01.074 }, 00:21:01.074 { 00:21:01.074 "dma_device_id": "system", 00:21:01.074 "dma_device_type": 1 00:21:01.074 }, 00:21:01.074 { 00:21:01.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.074 "dma_device_type": 2 00:21:01.074 } 00:21:01.074 ], 00:21:01.074 "driver_specific": { 00:21:01.074 "raid": { 00:21:01.074 "uuid": "70738bec-3fe9-4e57-8161-cc4c80ff5f29", 00:21:01.074 "strip_size_kb": 0, 00:21:01.074 "state": "online", 00:21:01.074 "raid_level": "raid1", 00:21:01.074 "superblock": false, 00:21:01.074 "num_base_bdevs": 4, 00:21:01.074 "num_base_bdevs_discovered": 4, 00:21:01.074 "num_base_bdevs_operational": 4, 00:21:01.074 "base_bdevs_list": [ 00:21:01.074 { 00:21:01.074 "name": "BaseBdev1", 00:21:01.074 "uuid": "d6daea9d-a45f-49a7-8952-72501272c3ef", 00:21:01.074 "is_configured": true, 00:21:01.074 "data_offset": 0, 00:21:01.074 "data_size": 65536 00:21:01.074 }, 00:21:01.074 { 00:21:01.074 "name": "BaseBdev2", 00:21:01.074 "uuid": "c29c19d8-79bc-4301-813c-41d90ae3fb86", 00:21:01.074 "is_configured": true, 00:21:01.074 "data_offset": 0, 00:21:01.074 "data_size": 65536 00:21:01.074 }, 00:21:01.074 { 00:21:01.074 "name": "BaseBdev3", 00:21:01.074 "uuid": "c8a371a0-1abe-472b-9fd0-eaa764b257f9", 00:21:01.074 "is_configured": true, 00:21:01.074 "data_offset": 0, 00:21:01.074 "data_size": 65536 00:21:01.075 }, 00:21:01.075 { 00:21:01.075 "name": "BaseBdev4", 00:21:01.075 "uuid": "03e9b104-1002-4f33-8f70-18fafb4e5eff", 00:21:01.075 "is_configured": true, 00:21:01.075 "data_offset": 0, 00:21:01.075 "data_size": 65536 00:21:01.075 } 00:21:01.075 ] 00:21:01.075 } 00:21:01.075 } 00:21:01.075 }' 00:21:01.075 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:01.075 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:01.075 BaseBdev2 00:21:01.075 BaseBdev3 00:21:01.075 BaseBdev4' 00:21:01.075 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.075 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:01.075 09:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.333 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.333 "name": "BaseBdev1", 00:21:01.333 "aliases": [ 00:21:01.333 "d6daea9d-a45f-49a7-8952-72501272c3ef" 00:21:01.333 ], 00:21:01.333 "product_name": "Malloc disk", 00:21:01.333 "block_size": 512, 00:21:01.333 "num_blocks": 65536, 00:21:01.333 "uuid": "d6daea9d-a45f-49a7-8952-72501272c3ef", 00:21:01.333 
"assigned_rate_limits": { 00:21:01.333 "rw_ios_per_sec": 0, 00:21:01.333 "rw_mbytes_per_sec": 0, 00:21:01.333 "r_mbytes_per_sec": 0, 00:21:01.333 "w_mbytes_per_sec": 0 00:21:01.333 }, 00:21:01.333 "claimed": true, 00:21:01.333 "claim_type": "exclusive_write", 00:21:01.333 "zoned": false, 00:21:01.333 "supported_io_types": { 00:21:01.333 "read": true, 00:21:01.333 "write": true, 00:21:01.333 "unmap": true, 00:21:01.333 "flush": true, 00:21:01.333 "reset": true, 00:21:01.333 "nvme_admin": false, 00:21:01.333 "nvme_io": false, 00:21:01.333 "nvme_io_md": false, 00:21:01.333 "write_zeroes": true, 00:21:01.333 "zcopy": true, 00:21:01.333 "get_zone_info": false, 00:21:01.333 "zone_management": false, 00:21:01.333 "zone_append": false, 00:21:01.333 "compare": false, 00:21:01.333 "compare_and_write": false, 00:21:01.333 "abort": true, 00:21:01.333 "seek_hole": false, 00:21:01.333 "seek_data": false, 00:21:01.333 "copy": true, 00:21:01.333 "nvme_iov_md": false 00:21:01.333 }, 00:21:01.333 "memory_domains": [ 00:21:01.333 { 00:21:01.333 "dma_device_id": "system", 00:21:01.333 "dma_device_type": 1 00:21:01.333 }, 00:21:01.333 { 00:21:01.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.333 "dma_device_type": 2 00:21:01.333 } 00:21:01.333 ], 00:21:01.333 "driver_specific": {} 00:21:01.333 }' 00:21:01.333 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.333 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.333 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:01.333 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.333 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:01.592 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.851 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.851 "name": "BaseBdev2", 00:21:01.851 "aliases": [ 00:21:01.851 "c29c19d8-79bc-4301-813c-41d90ae3fb86" 00:21:01.851 ], 00:21:01.851 "product_name": "Malloc disk", 00:21:01.851 "block_size": 512, 00:21:01.851 "num_blocks": 65536, 00:21:01.851 "uuid": "c29c19d8-79bc-4301-813c-41d90ae3fb86", 00:21:01.851 "assigned_rate_limits": { 00:21:01.851 "rw_ios_per_sec": 0, 00:21:01.851 "rw_mbytes_per_sec": 0, 00:21:01.851 "r_mbytes_per_sec": 0, 00:21:01.851 "w_mbytes_per_sec": 0 00:21:01.851 
}, 00:21:01.851 "claimed": true, 00:21:01.851 "claim_type": "exclusive_write", 00:21:01.851 "zoned": false, 00:21:01.851 "supported_io_types": { 00:21:01.851 "read": true, 00:21:01.851 "write": true, 00:21:01.851 "unmap": true, 00:21:01.851 "flush": true, 00:21:01.851 "reset": true, 00:21:01.851 "nvme_admin": false, 00:21:01.851 "nvme_io": false, 00:21:01.851 "nvme_io_md": false, 00:21:01.851 "write_zeroes": true, 00:21:01.851 "zcopy": true, 00:21:01.851 "get_zone_info": false, 00:21:01.851 "zone_management": false, 00:21:01.851 "zone_append": false, 00:21:01.851 "compare": false, 00:21:01.851 "compare_and_write": false, 00:21:01.851 "abort": true, 00:21:01.851 "seek_hole": false, 00:21:01.851 "seek_data": false, 00:21:01.851 "copy": true, 00:21:01.851 "nvme_iov_md": false 00:21:01.851 }, 00:21:01.851 "memory_domains": [ 00:21:01.851 { 00:21:01.851 "dma_device_id": "system", 00:21:01.851 "dma_device_type": 1 00:21:01.851 }, 00:21:01.851 { 00:21:01.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.851 "dma_device_type": 2 00:21:01.851 } 00:21:01.851 ], 00:21:01.851 "driver_specific": {} 00:21:01.851 }' 00:21:01.851 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.851 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.851 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:01.851 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.109 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.109 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.109 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.109 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.109 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.109 09:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.109 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.109 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.109 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.109 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:02.109 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:02.368 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:02.368 "name": "BaseBdev3", 00:21:02.368 "aliases": [ 00:21:02.368 "c8a371a0-1abe-472b-9fd0-eaa764b257f9" 00:21:02.368 ], 00:21:02.368 "product_name": "Malloc disk", 00:21:02.368 "block_size": 512, 00:21:02.368 "num_blocks": 65536, 00:21:02.368 "uuid": "c8a371a0-1abe-472b-9fd0-eaa764b257f9", 00:21:02.368 "assigned_rate_limits": { 00:21:02.368 "rw_ios_per_sec": 0, 00:21:02.368 "rw_mbytes_per_sec": 0, 00:21:02.368 "r_mbytes_per_sec": 0, 00:21:02.368 "w_mbytes_per_sec": 0 00:21:02.368 }, 00:21:02.368 "claimed": true, 00:21:02.368 "claim_type": "exclusive_write", 00:21:02.368 "zoned": false, 00:21:02.368 "supported_io_types": { 00:21:02.368 "read": true, 
00:21:02.368 "write": true, 00:21:02.368 "unmap": true, 00:21:02.368 "flush": true, 00:21:02.368 "reset": true, 00:21:02.368 "nvme_admin": false, 00:21:02.368 "nvme_io": false, 00:21:02.368 "nvme_io_md": false, 00:21:02.368 "write_zeroes": true, 00:21:02.368 "zcopy": true, 00:21:02.368 "get_zone_info": false, 00:21:02.368 "zone_management": false, 00:21:02.368 "zone_append": false, 00:21:02.368 "compare": false, 00:21:02.368 "compare_and_write": false, 00:21:02.368 "abort": true, 00:21:02.368 "seek_hole": false, 00:21:02.368 "seek_data": false, 00:21:02.368 "copy": true, 00:21:02.368 "nvme_iov_md": false 00:21:02.368 }, 00:21:02.368 "memory_domains": [ 00:21:02.368 { 00:21:02.368 "dma_device_id": "system", 00:21:02.368 "dma_device_type": 1 00:21:02.368 }, 00:21:02.368 { 00:21:02.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.368 "dma_device_type": 2 00:21:02.368 } 00:21:02.368 ], 00:21:02.368 "driver_specific": {} 00:21:02.368 }' 00:21:02.368 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.627 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.627 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:02.627 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.627 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.627 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.627 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.627 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.627 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.627 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.886 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.886 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.886 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.886 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:02.886 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:03.145 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:03.145 "name": "BaseBdev4", 00:21:03.145 "aliases": [ 00:21:03.145 "03e9b104-1002-4f33-8f70-18fafb4e5eff" 00:21:03.145 ], 00:21:03.145 "product_name": "Malloc disk", 00:21:03.145 "block_size": 512, 00:21:03.145 "num_blocks": 65536, 00:21:03.145 "uuid": "03e9b104-1002-4f33-8f70-18fafb4e5eff", 00:21:03.145 "assigned_rate_limits": { 00:21:03.145 "rw_ios_per_sec": 0, 00:21:03.145 "rw_mbytes_per_sec": 0, 00:21:03.145 "r_mbytes_per_sec": 0, 00:21:03.145 "w_mbytes_per_sec": 0 00:21:03.145 }, 00:21:03.145 "claimed": true, 00:21:03.145 "claim_type": "exclusive_write", 00:21:03.145 "zoned": false, 00:21:03.145 "supported_io_types": { 00:21:03.145 "read": true, 00:21:03.145 "write": true, 00:21:03.145 "unmap": true, 00:21:03.145 "flush": true, 00:21:03.145 "reset": true, 00:21:03.145 "nvme_admin": false, 00:21:03.145 "nvme_io": false, 
00:21:03.145 "nvme_io_md": false, 00:21:03.145 "write_zeroes": true, 00:21:03.145 "zcopy": true, 00:21:03.145 "get_zone_info": false, 00:21:03.145 "zone_management": false, 00:21:03.145 "zone_append": false, 00:21:03.145 "compare": false, 00:21:03.145 "compare_and_write": false, 00:21:03.145 "abort": true, 00:21:03.145 "seek_hole": false, 00:21:03.145 "seek_data": false, 00:21:03.145 "copy": true, 00:21:03.145 "nvme_iov_md": false 00:21:03.145 }, 00:21:03.145 "memory_domains": [ 00:21:03.145 { 00:21:03.145 "dma_device_id": "system", 00:21:03.145 "dma_device_type": 1 00:21:03.145 }, 00:21:03.145 { 00:21:03.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.145 "dma_device_type": 2 00:21:03.145 } 00:21:03.145 ], 00:21:03.145 "driver_specific": {} 00:21:03.145 }' 00:21:03.145 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.145 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.145 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:03.145 09:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.145 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.145 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:03.145 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.404 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.404 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:03.404 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.404 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.404 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:03.404 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:03.662 [2024-07-15 09:25:12.471676] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.662 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:21:03.663 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.663 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.663 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.663 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.663 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.663 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:03.921 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.921 "name": "Existed_Raid", 00:21:03.921 "uuid": "70738bec-3fe9-4e57-8161-cc4c80ff5f29", 00:21:03.921 "strip_size_kb": 0, 00:21:03.921 "state": "online", 00:21:03.921 "raid_level": "raid1", 00:21:03.921 "superblock": false, 00:21:03.921 "num_base_bdevs": 4, 00:21:03.921 "num_base_bdevs_discovered": 3, 00:21:03.921 "num_base_bdevs_operational": 3, 00:21:03.921 "base_bdevs_list": [ 00:21:03.921 { 00:21:03.921 "name": null, 00:21:03.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.921 "is_configured": false, 00:21:03.921 "data_offset": 0, 00:21:03.921 "data_size": 65536 00:21:03.921 }, 00:21:03.921 { 00:21:03.921 "name": "BaseBdev2", 00:21:03.921 "uuid": "c29c19d8-79bc-4301-813c-41d90ae3fb86", 00:21:03.921 "is_configured": true, 00:21:03.921 "data_offset": 0, 00:21:03.921 "data_size": 65536 00:21:03.921 }, 00:21:03.921 { 00:21:03.921 "name": "BaseBdev3", 00:21:03.921 "uuid": "c8a371a0-1abe-472b-9fd0-eaa764b257f9", 00:21:03.921 "is_configured": true, 00:21:03.921 "data_offset": 0, 00:21:03.921 "data_size": 65536 00:21:03.921 }, 00:21:03.921 { 00:21:03.921 "name": "BaseBdev4", 00:21:03.921 "uuid": "03e9b104-1002-4f33-8f70-18fafb4e5eff", 00:21:03.921 "is_configured": true, 00:21:03.921 "data_offset": 0, 00:21:03.921 "data_size": 65536 00:21:03.921 } 00:21:03.921 ] 00:21:03.921 }' 00:21:03.921 09:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.921 09:25:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:04.488 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:04.488 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:04.488 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.488 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:04.746 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:04.746 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:04.746 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:05.004 [2024-07-15 09:25:13.824315] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:05.004 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ 
)) 00:21:05.004 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:05.004 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:05.004 09:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.263 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:05.264 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:05.264 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:05.522 [2024-07-15 09:25:14.329945] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:05.522 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:05.522 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:05.522 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.522 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:05.813 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:05.813 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:05.813 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:06.081 [2024-07-15 09:25:14.799709] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:06.081 [2024-07-15 09:25:14.799804] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:06.081 [2024-07-15 09:25:14.820611] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:06.081 [2024-07-15 09:25:14.820649] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:06.081 [2024-07-15 09:25:14.820661] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8f9350 name Existed_Raid, state offline 00:21:06.081 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:06.081 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:06.081 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.081 09:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:06.081 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:06.081 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:06.081 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:06.081 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:06.081 09:25:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:06.081 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:06.340 BaseBdev2 00:21:06.340 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:06.340 09:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:06.340 09:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:06.340 09:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:06.340 09:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:06.340 09:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:06.340 09:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:06.599 09:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:06.859 [ 00:21:06.859 { 00:21:06.859 "name": "BaseBdev2", 00:21:06.859 "aliases": [ 00:21:06.859 "f9bb630a-2d7b-480d-a5c7-44992b877f5b" 00:21:06.859 ], 00:21:06.859 "product_name": "Malloc disk", 00:21:06.859 "block_size": 512, 00:21:06.859 "num_blocks": 65536, 00:21:06.859 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:06.859 "assigned_rate_limits": { 00:21:06.859 "rw_ios_per_sec": 0, 00:21:06.859 "rw_mbytes_per_sec": 0, 00:21:06.859 "r_mbytes_per_sec": 0, 00:21:06.859 "w_mbytes_per_sec": 0 00:21:06.859 }, 00:21:06.859 "claimed": false, 00:21:06.859 "zoned": false, 00:21:06.859 "supported_io_types": { 00:21:06.859 "read": true, 00:21:06.859 "write": true, 00:21:06.859 "unmap": true, 00:21:06.859 "flush": true, 00:21:06.859 "reset": true, 00:21:06.859 "nvme_admin": false, 00:21:06.859 "nvme_io": false, 00:21:06.859 "nvme_io_md": false, 00:21:06.859 "write_zeroes": true, 00:21:06.859 "zcopy": true, 00:21:06.859 "get_zone_info": false, 00:21:06.859 "zone_management": false, 00:21:06.859 "zone_append": false, 00:21:06.859 "compare": false, 00:21:06.859 "compare_and_write": false, 00:21:06.859 "abort": true, 00:21:06.859 "seek_hole": false, 00:21:06.859 "seek_data": false, 00:21:06.859 "copy": true, 00:21:06.859 "nvme_iov_md": false 00:21:06.859 }, 00:21:06.859 "memory_domains": [ 00:21:06.859 { 00:21:06.859 "dma_device_id": "system", 00:21:06.859 "dma_device_type": 1 00:21:06.859 }, 00:21:06.859 { 00:21:06.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.859 "dma_device_type": 2 00:21:06.859 } 00:21:06.859 ], 00:21:06.859 "driver_specific": {} 00:21:06.859 } 00:21:06.859 ] 00:21:06.859 09:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:06.859 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:06.859 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:06.859 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev3 00:21:07.118 BaseBdev3 00:21:07.118 09:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:07.118 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:07.118 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:07.118 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:07.118 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:07.118 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:07.118 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:07.377 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:07.637 [ 00:21:07.637 { 00:21:07.637 "name": "BaseBdev3", 00:21:07.637 "aliases": [ 00:21:07.637 "61f3f537-f326-458a-b41f-cc12c2623092" 00:21:07.637 ], 00:21:07.637 "product_name": "Malloc disk", 00:21:07.637 "block_size": 512, 00:21:07.637 "num_blocks": 65536, 00:21:07.637 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:07.637 "assigned_rate_limits": { 00:21:07.637 "rw_ios_per_sec": 0, 00:21:07.637 "rw_mbytes_per_sec": 0, 00:21:07.637 "r_mbytes_per_sec": 0, 00:21:07.637 "w_mbytes_per_sec": 0 00:21:07.637 }, 00:21:07.637 "claimed": false, 00:21:07.637 "zoned": false, 00:21:07.637 "supported_io_types": { 00:21:07.637 "read": true, 00:21:07.637 "write": true, 00:21:07.637 "unmap": true, 00:21:07.637 "flush": true, 00:21:07.637 "reset": true, 00:21:07.637 "nvme_admin": false, 00:21:07.637 "nvme_io": false, 00:21:07.637 "nvme_io_md": false, 00:21:07.637 "write_zeroes": true, 00:21:07.637 "zcopy": true, 00:21:07.637 "get_zone_info": false, 00:21:07.637 "zone_management": false, 00:21:07.637 "zone_append": false, 00:21:07.637 "compare": false, 00:21:07.637 "compare_and_write": false, 00:21:07.637 "abort": true, 00:21:07.637 "seek_hole": false, 00:21:07.637 "seek_data": false, 00:21:07.637 "copy": true, 00:21:07.637 "nvme_iov_md": false 00:21:07.637 }, 00:21:07.637 "memory_domains": [ 00:21:07.637 { 00:21:07.637 "dma_device_id": "system", 00:21:07.637 "dma_device_type": 1 00:21:07.637 }, 00:21:07.637 { 00:21:07.637 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.637 "dma_device_type": 2 00:21:07.637 } 00:21:07.637 ], 00:21:07.637 "driver_specific": {} 00:21:07.637 } 00:21:07.637 ] 00:21:07.637 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:07.637 09:25:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:07.637 09:25:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:07.637 09:25:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:07.896 BaseBdev4 00:21:07.896 09:25:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:07.896 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:07.896 
09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:07.896 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:07.896 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:07.896 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:07.896 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:08.156 09:25:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:08.414 [ 00:21:08.415 { 00:21:08.415 "name": "BaseBdev4", 00:21:08.415 "aliases": [ 00:21:08.415 "db4fa688-a167-44e1-a5a9-7bed565e58ec" 00:21:08.415 ], 00:21:08.415 "product_name": "Malloc disk", 00:21:08.415 "block_size": 512, 00:21:08.415 "num_blocks": 65536, 00:21:08.415 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:08.415 "assigned_rate_limits": { 00:21:08.415 "rw_ios_per_sec": 0, 00:21:08.415 "rw_mbytes_per_sec": 0, 00:21:08.415 "r_mbytes_per_sec": 0, 00:21:08.415 "w_mbytes_per_sec": 0 00:21:08.415 }, 00:21:08.415 "claimed": false, 00:21:08.415 "zoned": false, 00:21:08.415 "supported_io_types": { 00:21:08.415 "read": true, 00:21:08.415 "write": true, 00:21:08.415 "unmap": true, 00:21:08.415 "flush": true, 00:21:08.415 "reset": true, 00:21:08.415 "nvme_admin": false, 00:21:08.415 "nvme_io": false, 00:21:08.415 "nvme_io_md": false, 00:21:08.415 "write_zeroes": true, 00:21:08.415 "zcopy": true, 00:21:08.415 "get_zone_info": false, 00:21:08.415 "zone_management": false, 00:21:08.415 "zone_append": false, 00:21:08.415 "compare": false, 00:21:08.415 "compare_and_write": false, 00:21:08.415 "abort": true, 00:21:08.415 "seek_hole": false, 00:21:08.415 "seek_data": false, 00:21:08.415 "copy": true, 00:21:08.415 "nvme_iov_md": false 00:21:08.415 }, 00:21:08.415 "memory_domains": [ 00:21:08.415 { 00:21:08.415 "dma_device_id": "system", 00:21:08.415 "dma_device_type": 1 00:21:08.415 }, 00:21:08.415 { 00:21:08.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.415 "dma_device_type": 2 00:21:08.415 } 00:21:08.415 ], 00:21:08.415 "driver_specific": {} 00:21:08.415 } 00:21:08.415 ] 00:21:08.415 09:25:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:08.415 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:08.415 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:08.415 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:08.674 [2024-07-15 09:25:17.448651] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:08.674 [2024-07-15 09:25:17.448700] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:08.674 [2024-07-15 09:25:17.448721] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:08.674 [2024-07-15 09:25:17.450334] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is 
claimed 00:21:08.674 [2024-07-15 09:25:17.450383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.674 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:08.933 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:08.933 "name": "Existed_Raid", 00:21:08.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:08.933 "strip_size_kb": 0, 00:21:08.933 "state": "configuring", 00:21:08.933 "raid_level": "raid1", 00:21:08.933 "superblock": false, 00:21:08.933 "num_base_bdevs": 4, 00:21:08.933 "num_base_bdevs_discovered": 3, 00:21:08.933 "num_base_bdevs_operational": 4, 00:21:08.933 "base_bdevs_list": [ 00:21:08.933 { 00:21:08.933 "name": "BaseBdev1", 00:21:08.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:08.933 "is_configured": false, 00:21:08.933 "data_offset": 0, 00:21:08.933 "data_size": 0 00:21:08.933 }, 00:21:08.933 { 00:21:08.933 "name": "BaseBdev2", 00:21:08.933 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:08.933 "is_configured": true, 00:21:08.933 "data_offset": 0, 00:21:08.933 "data_size": 65536 00:21:08.933 }, 00:21:08.933 { 00:21:08.933 "name": "BaseBdev3", 00:21:08.933 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:08.933 "is_configured": true, 00:21:08.933 "data_offset": 0, 00:21:08.933 "data_size": 65536 00:21:08.933 }, 00:21:08.933 { 00:21:08.933 "name": "BaseBdev4", 00:21:08.933 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:08.933 "is_configured": true, 00:21:08.933 "data_offset": 0, 00:21:08.933 "data_size": 65536 00:21:08.933 } 00:21:08.933 ] 00:21:08.933 }' 00:21:08.933 09:25:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:08.933 09:25:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:09.504 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:10.072 [2024-07-15 09:25:18.832304] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.072 09:25:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.331 09:25:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.331 "name": "Existed_Raid", 00:21:10.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.331 "strip_size_kb": 0, 00:21:10.331 "state": "configuring", 00:21:10.331 "raid_level": "raid1", 00:21:10.331 "superblock": false, 00:21:10.331 "num_base_bdevs": 4, 00:21:10.331 "num_base_bdevs_discovered": 2, 00:21:10.331 "num_base_bdevs_operational": 4, 00:21:10.331 "base_bdevs_list": [ 00:21:10.331 { 00:21:10.331 "name": "BaseBdev1", 00:21:10.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.331 "is_configured": false, 00:21:10.331 "data_offset": 0, 00:21:10.331 "data_size": 0 00:21:10.331 }, 00:21:10.331 { 00:21:10.331 "name": null, 00:21:10.331 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:10.331 "is_configured": false, 00:21:10.331 "data_offset": 0, 00:21:10.331 "data_size": 65536 00:21:10.331 }, 00:21:10.331 { 00:21:10.331 "name": "BaseBdev3", 00:21:10.331 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:10.331 "is_configured": true, 00:21:10.331 "data_offset": 0, 00:21:10.331 "data_size": 65536 00:21:10.331 }, 00:21:10.331 { 00:21:10.331 "name": "BaseBdev4", 00:21:10.331 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:10.331 "is_configured": true, 00:21:10.331 "data_offset": 0, 00:21:10.331 "data_size": 65536 00:21:10.331 } 00:21:10.331 ] 00:21:10.331 }' 00:21:10.331 09:25:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.331 09:25:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:10.899 09:25:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.899 09:25:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:11.158 09:25:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:11.158 09:25:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:11.418 [2024-07-15 09:25:20.194761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:11.418 BaseBdev1 00:21:11.418 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:11.418 09:25:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:11.418 09:25:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:11.418 09:25:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:11.418 09:25:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:11.418 09:25:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:11.418 09:25:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:11.677 09:25:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:11.677 [ 00:21:11.677 { 00:21:11.677 "name": "BaseBdev1", 00:21:11.677 "aliases": [ 00:21:11.677 "53a5d617-5db6-4d2f-8d43-8aed67289eeb" 00:21:11.677 ], 00:21:11.677 "product_name": "Malloc disk", 00:21:11.677 "block_size": 512, 00:21:11.677 "num_blocks": 65536, 00:21:11.677 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:11.677 "assigned_rate_limits": { 00:21:11.677 "rw_ios_per_sec": 0, 00:21:11.677 "rw_mbytes_per_sec": 0, 00:21:11.677 "r_mbytes_per_sec": 0, 00:21:11.677 "w_mbytes_per_sec": 0 00:21:11.677 }, 00:21:11.677 "claimed": true, 00:21:11.677 "claim_type": "exclusive_write", 00:21:11.677 "zoned": false, 00:21:11.677 "supported_io_types": { 00:21:11.677 "read": true, 00:21:11.677 "write": true, 00:21:11.677 "unmap": true, 00:21:11.677 "flush": true, 00:21:11.677 "reset": true, 00:21:11.677 "nvme_admin": false, 00:21:11.677 "nvme_io": false, 00:21:11.677 "nvme_io_md": false, 00:21:11.677 "write_zeroes": true, 00:21:11.677 "zcopy": true, 00:21:11.677 "get_zone_info": false, 00:21:11.677 "zone_management": false, 00:21:11.677 "zone_append": false, 00:21:11.677 "compare": false, 00:21:11.677 "compare_and_write": false, 00:21:11.677 "abort": true, 00:21:11.677 "seek_hole": false, 00:21:11.677 "seek_data": false, 00:21:11.677 "copy": true, 00:21:11.677 "nvme_iov_md": false 00:21:11.677 }, 00:21:11.677 "memory_domains": [ 00:21:11.677 { 00:21:11.677 "dma_device_id": "system", 00:21:11.678 "dma_device_type": 1 00:21:11.678 }, 00:21:11.678 { 00:21:11.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.678 "dma_device_type": 2 00:21:11.678 } 00:21:11.678 ], 00:21:11.678 "driver_specific": {} 00:21:11.678 } 00:21:11.678 ] 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.678 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:11.937 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:11.937 "name": "Existed_Raid", 00:21:11.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:11.937 "strip_size_kb": 0, 00:21:11.937 "state": "configuring", 00:21:11.937 "raid_level": "raid1", 00:21:11.937 "superblock": false, 00:21:11.937 "num_base_bdevs": 4, 00:21:11.937 "num_base_bdevs_discovered": 3, 00:21:11.937 "num_base_bdevs_operational": 4, 00:21:11.937 "base_bdevs_list": [ 00:21:11.937 { 00:21:11.937 "name": "BaseBdev1", 00:21:11.937 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:11.937 "is_configured": true, 00:21:11.937 "data_offset": 0, 00:21:11.937 "data_size": 65536 00:21:11.937 }, 00:21:11.937 { 00:21:11.937 "name": null, 00:21:11.937 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:11.937 "is_configured": false, 00:21:11.937 "data_offset": 0, 00:21:11.937 "data_size": 65536 00:21:11.937 }, 00:21:11.937 { 00:21:11.937 "name": "BaseBdev3", 00:21:11.937 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:11.937 "is_configured": true, 00:21:11.937 "data_offset": 0, 00:21:11.937 "data_size": 65536 00:21:11.937 }, 00:21:11.937 { 00:21:11.937 "name": "BaseBdev4", 00:21:11.937 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:11.937 "is_configured": true, 00:21:11.937 "data_offset": 0, 00:21:11.937 "data_size": 65536 00:21:11.937 } 00:21:11.937 ] 00:21:11.937 }' 00:21:11.937 09:25:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:11.937 09:25:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:12.505 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:12.505 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.764 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:12.764 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:13.024 
[2024-07-15 09:25:21.887246] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.024 09:25:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:13.284 09:25:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:13.284 "name": "Existed_Raid", 00:21:13.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:13.284 "strip_size_kb": 0, 00:21:13.284 "state": "configuring", 00:21:13.284 "raid_level": "raid1", 00:21:13.284 "superblock": false, 00:21:13.284 "num_base_bdevs": 4, 00:21:13.284 "num_base_bdevs_discovered": 2, 00:21:13.284 "num_base_bdevs_operational": 4, 00:21:13.284 "base_bdevs_list": [ 00:21:13.284 { 00:21:13.284 "name": "BaseBdev1", 00:21:13.284 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:13.284 "is_configured": true, 00:21:13.284 "data_offset": 0, 00:21:13.284 "data_size": 65536 00:21:13.284 }, 00:21:13.284 { 00:21:13.284 "name": null, 00:21:13.284 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:13.284 "is_configured": false, 00:21:13.284 "data_offset": 0, 00:21:13.284 "data_size": 65536 00:21:13.284 }, 00:21:13.284 { 00:21:13.284 "name": null, 00:21:13.284 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:13.284 "is_configured": false, 00:21:13.284 "data_offset": 0, 00:21:13.284 "data_size": 65536 00:21:13.284 }, 00:21:13.284 { 00:21:13.284 "name": "BaseBdev4", 00:21:13.284 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:13.284 "is_configured": true, 00:21:13.284 "data_offset": 0, 00:21:13.284 "data_size": 65536 00:21:13.284 } 00:21:13.284 ] 00:21:13.284 }' 00:21:13.284 09:25:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:13.284 09:25:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:13.853 09:25:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.853 09:25:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:14.111 
09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:14.111 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:14.369 [2024-07-15 09:25:23.234850] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.370 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:14.627 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.627 "name": "Existed_Raid", 00:21:14.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.627 "strip_size_kb": 0, 00:21:14.627 "state": "configuring", 00:21:14.627 "raid_level": "raid1", 00:21:14.627 "superblock": false, 00:21:14.627 "num_base_bdevs": 4, 00:21:14.627 "num_base_bdevs_discovered": 3, 00:21:14.627 "num_base_bdevs_operational": 4, 00:21:14.627 "base_bdevs_list": [ 00:21:14.627 { 00:21:14.627 "name": "BaseBdev1", 00:21:14.627 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:14.627 "is_configured": true, 00:21:14.627 "data_offset": 0, 00:21:14.627 "data_size": 65536 00:21:14.627 }, 00:21:14.627 { 00:21:14.627 "name": null, 00:21:14.627 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:14.627 "is_configured": false, 00:21:14.627 "data_offset": 0, 00:21:14.627 "data_size": 65536 00:21:14.627 }, 00:21:14.627 { 00:21:14.627 "name": "BaseBdev3", 00:21:14.627 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:14.627 "is_configured": true, 00:21:14.627 "data_offset": 0, 00:21:14.627 "data_size": 65536 00:21:14.627 }, 00:21:14.627 { 00:21:14.627 "name": "BaseBdev4", 00:21:14.627 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:14.627 "is_configured": true, 00:21:14.627 "data_offset": 0, 00:21:14.627 "data_size": 65536 00:21:14.627 } 00:21:14.627 ] 00:21:14.627 }' 00:21:14.627 09:25:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.627 09:25:23 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:15.193 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.193 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:15.451 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:15.451 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:15.709 [2024-07-15 09:25:24.530313] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.709 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:15.968 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.968 "name": "Existed_Raid", 00:21:15.968 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:15.968 "strip_size_kb": 0, 00:21:15.968 "state": "configuring", 00:21:15.968 "raid_level": "raid1", 00:21:15.968 "superblock": false, 00:21:15.968 "num_base_bdevs": 4, 00:21:15.968 "num_base_bdevs_discovered": 2, 00:21:15.968 "num_base_bdevs_operational": 4, 00:21:15.968 "base_bdevs_list": [ 00:21:15.968 { 00:21:15.968 "name": null, 00:21:15.968 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:15.968 "is_configured": false, 00:21:15.968 "data_offset": 0, 00:21:15.968 "data_size": 65536 00:21:15.968 }, 00:21:15.968 { 00:21:15.968 "name": null, 00:21:15.968 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:15.968 "is_configured": false, 00:21:15.968 "data_offset": 0, 00:21:15.968 "data_size": 65536 00:21:15.968 }, 00:21:15.968 { 00:21:15.968 "name": "BaseBdev3", 00:21:15.968 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:15.968 "is_configured": true, 00:21:15.968 "data_offset": 0, 00:21:15.968 "data_size": 65536 00:21:15.968 }, 00:21:15.968 { 00:21:15.968 "name": "BaseBdev4", 00:21:15.968 "uuid": 
"db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:15.968 "is_configured": true, 00:21:15.968 "data_offset": 0, 00:21:15.968 "data_size": 65536 00:21:15.968 } 00:21:15.968 ] 00:21:15.968 }' 00:21:15.968 09:25:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.968 09:25:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:16.535 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.535 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:16.795 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:16.795 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:17.054 [2024-07-15 09:25:25.925357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.054 09:25:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:17.312 09:25:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:17.312 "name": "Existed_Raid", 00:21:17.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:17.312 "strip_size_kb": 0, 00:21:17.312 "state": "configuring", 00:21:17.312 "raid_level": "raid1", 00:21:17.312 "superblock": false, 00:21:17.312 "num_base_bdevs": 4, 00:21:17.312 "num_base_bdevs_discovered": 3, 00:21:17.312 "num_base_bdevs_operational": 4, 00:21:17.312 "base_bdevs_list": [ 00:21:17.312 { 00:21:17.312 "name": null, 00:21:17.312 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:17.312 "is_configured": false, 00:21:17.312 "data_offset": 0, 00:21:17.312 "data_size": 65536 00:21:17.312 }, 00:21:17.312 { 00:21:17.312 "name": "BaseBdev2", 00:21:17.312 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:17.312 "is_configured": true, 00:21:17.312 
"data_offset": 0, 00:21:17.312 "data_size": 65536 00:21:17.312 }, 00:21:17.312 { 00:21:17.312 "name": "BaseBdev3", 00:21:17.312 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:17.312 "is_configured": true, 00:21:17.312 "data_offset": 0, 00:21:17.312 "data_size": 65536 00:21:17.312 }, 00:21:17.312 { 00:21:17.312 "name": "BaseBdev4", 00:21:17.312 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:17.312 "is_configured": true, 00:21:17.312 "data_offset": 0, 00:21:17.312 "data_size": 65536 00:21:17.312 } 00:21:17.312 ] 00:21:17.312 }' 00:21:17.312 09:25:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:17.312 09:25:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:17.878 09:25:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.878 09:25:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:18.136 09:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:18.136 09:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:18.136 09:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.394 09:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 53a5d617-5db6-4d2f-8d43-8aed67289eeb 00:21:18.652 [2024-07-15 09:25:27.538619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:18.652 [2024-07-15 09:25:27.538669] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8f7610 00:21:18.652 [2024-07-15 09:25:27.538678] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:18.652 [2024-07-15 09:25:27.538905] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8f8a70 00:21:18.652 [2024-07-15 09:25:27.539067] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8f7610 00:21:18.652 [2024-07-15 09:25:27.539078] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8f7610 00:21:18.652 [2024-07-15 09:25:27.539265] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:18.652 NewBaseBdev 00:21:18.652 09:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:18.652 09:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:18.652 09:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:18.653 09:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:18.653 09:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:18.653 09:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:18.653 09:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:21:18.910 09:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:19.168 [ 00:21:19.168 { 00:21:19.168 "name": "NewBaseBdev", 00:21:19.168 "aliases": [ 00:21:19.168 "53a5d617-5db6-4d2f-8d43-8aed67289eeb" 00:21:19.168 ], 00:21:19.168 "product_name": "Malloc disk", 00:21:19.168 "block_size": 512, 00:21:19.168 "num_blocks": 65536, 00:21:19.168 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:19.168 "assigned_rate_limits": { 00:21:19.168 "rw_ios_per_sec": 0, 00:21:19.168 "rw_mbytes_per_sec": 0, 00:21:19.168 "r_mbytes_per_sec": 0, 00:21:19.168 "w_mbytes_per_sec": 0 00:21:19.168 }, 00:21:19.168 "claimed": true, 00:21:19.168 "claim_type": "exclusive_write", 00:21:19.168 "zoned": false, 00:21:19.168 "supported_io_types": { 00:21:19.168 "read": true, 00:21:19.168 "write": true, 00:21:19.168 "unmap": true, 00:21:19.168 "flush": true, 00:21:19.168 "reset": true, 00:21:19.168 "nvme_admin": false, 00:21:19.168 "nvme_io": false, 00:21:19.168 "nvme_io_md": false, 00:21:19.168 "write_zeroes": true, 00:21:19.168 "zcopy": true, 00:21:19.168 "get_zone_info": false, 00:21:19.168 "zone_management": false, 00:21:19.168 "zone_append": false, 00:21:19.168 "compare": false, 00:21:19.168 "compare_and_write": false, 00:21:19.168 "abort": true, 00:21:19.168 "seek_hole": false, 00:21:19.169 "seek_data": false, 00:21:19.169 "copy": true, 00:21:19.169 "nvme_iov_md": false 00:21:19.169 }, 00:21:19.169 "memory_domains": [ 00:21:19.169 { 00:21:19.169 "dma_device_id": "system", 00:21:19.169 "dma_device_type": 1 00:21:19.169 }, 00:21:19.169 { 00:21:19.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.169 "dma_device_type": 2 00:21:19.169 } 00:21:19.169 ], 00:21:19.169 "driver_specific": {} 00:21:19.169 } 00:21:19.169 ] 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.169 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:19.427 09:25:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.427 "name": "Existed_Raid", 00:21:19.427 "uuid": "668e68db-af71-489f-9550-335254a48f53", 00:21:19.427 "strip_size_kb": 0, 00:21:19.427 "state": "online", 00:21:19.427 "raid_level": "raid1", 00:21:19.427 "superblock": false, 00:21:19.427 "num_base_bdevs": 4, 00:21:19.427 "num_base_bdevs_discovered": 4, 00:21:19.427 "num_base_bdevs_operational": 4, 00:21:19.427 "base_bdevs_list": [ 00:21:19.427 { 00:21:19.428 "name": "NewBaseBdev", 00:21:19.428 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:19.428 "is_configured": true, 00:21:19.428 "data_offset": 0, 00:21:19.428 "data_size": 65536 00:21:19.428 }, 00:21:19.428 { 00:21:19.428 "name": "BaseBdev2", 00:21:19.428 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:19.428 "is_configured": true, 00:21:19.428 "data_offset": 0, 00:21:19.428 "data_size": 65536 00:21:19.428 }, 00:21:19.428 { 00:21:19.428 "name": "BaseBdev3", 00:21:19.428 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:19.428 "is_configured": true, 00:21:19.428 "data_offset": 0, 00:21:19.428 "data_size": 65536 00:21:19.428 }, 00:21:19.428 { 00:21:19.428 "name": "BaseBdev4", 00:21:19.428 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:19.428 "is_configured": true, 00:21:19.428 "data_offset": 0, 00:21:19.428 "data_size": 65536 00:21:19.428 } 00:21:19.428 ] 00:21:19.428 }' 00:21:19.428 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.428 09:25:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:20.038 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:20.038 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:20.038 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:20.038 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:20.038 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:20.038 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:20.038 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:20.038 09:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:20.321 [2024-07-15 09:25:29.143189] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:20.321 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:20.321 "name": "Existed_Raid", 00:21:20.321 "aliases": [ 00:21:20.321 "668e68db-af71-489f-9550-335254a48f53" 00:21:20.321 ], 00:21:20.321 "product_name": "Raid Volume", 00:21:20.321 "block_size": 512, 00:21:20.321 "num_blocks": 65536, 00:21:20.321 "uuid": "668e68db-af71-489f-9550-335254a48f53", 00:21:20.321 "assigned_rate_limits": { 00:21:20.321 "rw_ios_per_sec": 0, 00:21:20.321 "rw_mbytes_per_sec": 0, 00:21:20.321 "r_mbytes_per_sec": 0, 00:21:20.321 "w_mbytes_per_sec": 0 00:21:20.321 }, 00:21:20.321 "claimed": false, 00:21:20.321 "zoned": false, 00:21:20.321 "supported_io_types": { 00:21:20.321 "read": true, 00:21:20.321 "write": true, 00:21:20.321 "unmap": false, 00:21:20.321 "flush": false, 00:21:20.321 "reset": true, 00:21:20.321 "nvme_admin": false, 
00:21:20.321 "nvme_io": false, 00:21:20.321 "nvme_io_md": false, 00:21:20.321 "write_zeroes": true, 00:21:20.321 "zcopy": false, 00:21:20.321 "get_zone_info": false, 00:21:20.321 "zone_management": false, 00:21:20.321 "zone_append": false, 00:21:20.321 "compare": false, 00:21:20.321 "compare_and_write": false, 00:21:20.321 "abort": false, 00:21:20.321 "seek_hole": false, 00:21:20.321 "seek_data": false, 00:21:20.321 "copy": false, 00:21:20.321 "nvme_iov_md": false 00:21:20.321 }, 00:21:20.321 "memory_domains": [ 00:21:20.321 { 00:21:20.322 "dma_device_id": "system", 00:21:20.322 "dma_device_type": 1 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.322 "dma_device_type": 2 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "dma_device_id": "system", 00:21:20.322 "dma_device_type": 1 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.322 "dma_device_type": 2 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "dma_device_id": "system", 00:21:20.322 "dma_device_type": 1 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.322 "dma_device_type": 2 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "dma_device_id": "system", 00:21:20.322 "dma_device_type": 1 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.322 "dma_device_type": 2 00:21:20.322 } 00:21:20.322 ], 00:21:20.322 "driver_specific": { 00:21:20.322 "raid": { 00:21:20.322 "uuid": "668e68db-af71-489f-9550-335254a48f53", 00:21:20.322 "strip_size_kb": 0, 00:21:20.322 "state": "online", 00:21:20.322 "raid_level": "raid1", 00:21:20.322 "superblock": false, 00:21:20.322 "num_base_bdevs": 4, 00:21:20.322 "num_base_bdevs_discovered": 4, 00:21:20.322 "num_base_bdevs_operational": 4, 00:21:20.322 "base_bdevs_list": [ 00:21:20.322 { 00:21:20.322 "name": "NewBaseBdev", 00:21:20.322 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:20.322 "is_configured": true, 00:21:20.322 "data_offset": 0, 00:21:20.322 "data_size": 65536 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "name": "BaseBdev2", 00:21:20.322 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:20.322 "is_configured": true, 00:21:20.322 "data_offset": 0, 00:21:20.322 "data_size": 65536 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "name": "BaseBdev3", 00:21:20.322 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:20.322 "is_configured": true, 00:21:20.322 "data_offset": 0, 00:21:20.322 "data_size": 65536 00:21:20.322 }, 00:21:20.322 { 00:21:20.322 "name": "BaseBdev4", 00:21:20.322 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:20.322 "is_configured": true, 00:21:20.322 "data_offset": 0, 00:21:20.322 "data_size": 65536 00:21:20.322 } 00:21:20.322 ] 00:21:20.322 } 00:21:20.322 } 00:21:20.322 }' 00:21:20.322 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:20.322 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:20.322 BaseBdev2 00:21:20.322 BaseBdev3 00:21:20.322 BaseBdev4' 00:21:20.322 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:20.322 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:20.322 09:25:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:20.581 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:20.581 "name": "NewBaseBdev", 00:21:20.581 "aliases": [ 00:21:20.581 "53a5d617-5db6-4d2f-8d43-8aed67289eeb" 00:21:20.581 ], 00:21:20.581 "product_name": "Malloc disk", 00:21:20.581 "block_size": 512, 00:21:20.581 "num_blocks": 65536, 00:21:20.581 "uuid": "53a5d617-5db6-4d2f-8d43-8aed67289eeb", 00:21:20.581 "assigned_rate_limits": { 00:21:20.581 "rw_ios_per_sec": 0, 00:21:20.581 "rw_mbytes_per_sec": 0, 00:21:20.581 "r_mbytes_per_sec": 0, 00:21:20.581 "w_mbytes_per_sec": 0 00:21:20.581 }, 00:21:20.581 "claimed": true, 00:21:20.581 "claim_type": "exclusive_write", 00:21:20.581 "zoned": false, 00:21:20.581 "supported_io_types": { 00:21:20.581 "read": true, 00:21:20.581 "write": true, 00:21:20.581 "unmap": true, 00:21:20.581 "flush": true, 00:21:20.581 "reset": true, 00:21:20.581 "nvme_admin": false, 00:21:20.581 "nvme_io": false, 00:21:20.581 "nvme_io_md": false, 00:21:20.581 "write_zeroes": true, 00:21:20.581 "zcopy": true, 00:21:20.581 "get_zone_info": false, 00:21:20.581 "zone_management": false, 00:21:20.581 "zone_append": false, 00:21:20.581 "compare": false, 00:21:20.581 "compare_and_write": false, 00:21:20.581 "abort": true, 00:21:20.581 "seek_hole": false, 00:21:20.581 "seek_data": false, 00:21:20.581 "copy": true, 00:21:20.581 "nvme_iov_md": false 00:21:20.581 }, 00:21:20.581 "memory_domains": [ 00:21:20.581 { 00:21:20.581 "dma_device_id": "system", 00:21:20.581 "dma_device_type": 1 00:21:20.581 }, 00:21:20.581 { 00:21:20.581 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.581 "dma_device_type": 2 00:21:20.581 } 00:21:20.581 ], 00:21:20.581 "driver_specific": {} 00:21:20.581 }' 00:21:20.581 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.581 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.840 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:20.840 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.840 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.840 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:20.841 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.841 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.841 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:20.841 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.841 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.100 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:21.100 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.100 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:21.100 09:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:21.359 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:21.359 "name": "BaseBdev2", 
00:21:21.359 "aliases": [ 00:21:21.359 "f9bb630a-2d7b-480d-a5c7-44992b877f5b" 00:21:21.359 ], 00:21:21.359 "product_name": "Malloc disk", 00:21:21.359 "block_size": 512, 00:21:21.359 "num_blocks": 65536, 00:21:21.359 "uuid": "f9bb630a-2d7b-480d-a5c7-44992b877f5b", 00:21:21.359 "assigned_rate_limits": { 00:21:21.359 "rw_ios_per_sec": 0, 00:21:21.359 "rw_mbytes_per_sec": 0, 00:21:21.359 "r_mbytes_per_sec": 0, 00:21:21.359 "w_mbytes_per_sec": 0 00:21:21.359 }, 00:21:21.359 "claimed": true, 00:21:21.359 "claim_type": "exclusive_write", 00:21:21.359 "zoned": false, 00:21:21.359 "supported_io_types": { 00:21:21.359 "read": true, 00:21:21.359 "write": true, 00:21:21.359 "unmap": true, 00:21:21.359 "flush": true, 00:21:21.359 "reset": true, 00:21:21.359 "nvme_admin": false, 00:21:21.359 "nvme_io": false, 00:21:21.359 "nvme_io_md": false, 00:21:21.359 "write_zeroes": true, 00:21:21.359 "zcopy": true, 00:21:21.359 "get_zone_info": false, 00:21:21.359 "zone_management": false, 00:21:21.359 "zone_append": false, 00:21:21.359 "compare": false, 00:21:21.359 "compare_and_write": false, 00:21:21.359 "abort": true, 00:21:21.359 "seek_hole": false, 00:21:21.359 "seek_data": false, 00:21:21.359 "copy": true, 00:21:21.359 "nvme_iov_md": false 00:21:21.359 }, 00:21:21.359 "memory_domains": [ 00:21:21.359 { 00:21:21.359 "dma_device_id": "system", 00:21:21.359 "dma_device_type": 1 00:21:21.359 }, 00:21:21.359 { 00:21:21.359 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.359 "dma_device_type": 2 00:21:21.359 } 00:21:21.359 ], 00:21:21.359 "driver_specific": {} 00:21:21.359 }' 00:21:21.359 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.359 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.359 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:21.359 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.359 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.359 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:21.359 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.618 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.618 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:21.618 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.618 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.618 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:21.619 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.619 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:21.619 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:21.877 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:21.877 "name": "BaseBdev3", 00:21:21.877 "aliases": [ 00:21:21.877 "61f3f537-f326-458a-b41f-cc12c2623092" 00:21:21.877 ], 00:21:21.877 "product_name": "Malloc disk", 00:21:21.877 "block_size": 512, 
00:21:21.877 "num_blocks": 65536, 00:21:21.877 "uuid": "61f3f537-f326-458a-b41f-cc12c2623092", 00:21:21.877 "assigned_rate_limits": { 00:21:21.877 "rw_ios_per_sec": 0, 00:21:21.877 "rw_mbytes_per_sec": 0, 00:21:21.877 "r_mbytes_per_sec": 0, 00:21:21.877 "w_mbytes_per_sec": 0 00:21:21.877 }, 00:21:21.877 "claimed": true, 00:21:21.877 "claim_type": "exclusive_write", 00:21:21.877 "zoned": false, 00:21:21.877 "supported_io_types": { 00:21:21.877 "read": true, 00:21:21.877 "write": true, 00:21:21.877 "unmap": true, 00:21:21.877 "flush": true, 00:21:21.877 "reset": true, 00:21:21.877 "nvme_admin": false, 00:21:21.877 "nvme_io": false, 00:21:21.877 "nvme_io_md": false, 00:21:21.877 "write_zeroes": true, 00:21:21.877 "zcopy": true, 00:21:21.877 "get_zone_info": false, 00:21:21.877 "zone_management": false, 00:21:21.877 "zone_append": false, 00:21:21.877 "compare": false, 00:21:21.877 "compare_and_write": false, 00:21:21.877 "abort": true, 00:21:21.877 "seek_hole": false, 00:21:21.877 "seek_data": false, 00:21:21.877 "copy": true, 00:21:21.877 "nvme_iov_md": false 00:21:21.877 }, 00:21:21.877 "memory_domains": [ 00:21:21.878 { 00:21:21.878 "dma_device_id": "system", 00:21:21.878 "dma_device_type": 1 00:21:21.878 }, 00:21:21.878 { 00:21:21.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.878 "dma_device_type": 2 00:21:21.878 } 00:21:21.878 ], 00:21:21.878 "driver_specific": {} 00:21:21.878 }' 00:21:21.878 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.878 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.878 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:21.878 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.136 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.136 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.136 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.136 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.136 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:22.136 09:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.136 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.136 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:22.136 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:22.136 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:22.136 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:22.394 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:22.394 "name": "BaseBdev4", 00:21:22.395 "aliases": [ 00:21:22.395 "db4fa688-a167-44e1-a5a9-7bed565e58ec" 00:21:22.395 ], 00:21:22.395 "product_name": "Malloc disk", 00:21:22.395 "block_size": 512, 00:21:22.395 "num_blocks": 65536, 00:21:22.395 "uuid": "db4fa688-a167-44e1-a5a9-7bed565e58ec", 00:21:22.395 "assigned_rate_limits": { 00:21:22.395 "rw_ios_per_sec": 0, 00:21:22.395 
"rw_mbytes_per_sec": 0, 00:21:22.395 "r_mbytes_per_sec": 0, 00:21:22.395 "w_mbytes_per_sec": 0 00:21:22.395 }, 00:21:22.395 "claimed": true, 00:21:22.395 "claim_type": "exclusive_write", 00:21:22.395 "zoned": false, 00:21:22.395 "supported_io_types": { 00:21:22.395 "read": true, 00:21:22.395 "write": true, 00:21:22.395 "unmap": true, 00:21:22.395 "flush": true, 00:21:22.395 "reset": true, 00:21:22.395 "nvme_admin": false, 00:21:22.395 "nvme_io": false, 00:21:22.395 "nvme_io_md": false, 00:21:22.395 "write_zeroes": true, 00:21:22.395 "zcopy": true, 00:21:22.395 "get_zone_info": false, 00:21:22.395 "zone_management": false, 00:21:22.395 "zone_append": false, 00:21:22.395 "compare": false, 00:21:22.395 "compare_and_write": false, 00:21:22.395 "abort": true, 00:21:22.395 "seek_hole": false, 00:21:22.395 "seek_data": false, 00:21:22.395 "copy": true, 00:21:22.395 "nvme_iov_md": false 00:21:22.395 }, 00:21:22.395 "memory_domains": [ 00:21:22.395 { 00:21:22.395 "dma_device_id": "system", 00:21:22.395 "dma_device_type": 1 00:21:22.395 }, 00:21:22.395 { 00:21:22.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.395 "dma_device_type": 2 00:21:22.395 } 00:21:22.395 ], 00:21:22.395 "driver_specific": {} 00:21:22.395 }' 00:21:22.395 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.653 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.653 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:22.653 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.653 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.653 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.653 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.653 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.653 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:22.653 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.911 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.911 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:22.911 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:23.170 [2024-07-15 09:25:31.890181] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:23.170 [2024-07-15 09:25:31.890212] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:23.170 [2024-07-15 09:25:31.890270] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:23.170 [2024-07-15 09:25:31.890568] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:23.170 [2024-07-15 09:25:31.890587] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8f7610 name Existed_Raid, state offline 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 174649 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 
174649 ']' 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 174649 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 174649 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 174649' 00:21:23.170 killing process with pid 174649 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 174649 00:21:23.170 [2024-07-15 09:25:31.960516] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:23.170 09:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 174649 00:21:23.170 [2024-07-15 09:25:32.040382] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:23.740 00:21:23.740 real 0m33.050s 00:21:23.740 user 1m0.528s 00:21:23.740 sys 0m5.818s 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:23.740 ************************************ 00:21:23.740 END TEST raid_state_function_test 00:21:23.740 ************************************ 00:21:23.740 09:25:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:23.740 09:25:32 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:21:23.740 09:25:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:23.740 09:25:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:23.740 09:25:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:23.740 ************************************ 00:21:23.740 START TEST raid_state_function_test_sb 00:21:23.740 ************************************ 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=179707 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 179707' 00:21:23.740 Process raid pid: 179707 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 179707 /var/tmp/spdk-raid.sock 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 179707 ']' 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:23.740 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:23.740 09:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:23.740 [2024-07-15 09:25:32.594493] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:21:23.740 [2024-07-15 09:25:32.594561] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:23.999 [2024-07-15 09:25:32.725784] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:23.999 [2024-07-15 09:25:32.826501] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:23.999 [2024-07-15 09:25:32.889507] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:23.999 [2024-07-15 09:25:32.889542] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:24.936 [2024-07-15 09:25:33.748596] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:24.936 [2024-07-15 09:25:33.748640] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:24.936 [2024-07-15 09:25:33.748652] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:24.936 [2024-07-15 09:25:33.748668] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:24.936 [2024-07-15 09:25:33.748677] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:24.936 [2024-07-15 09:25:33.748689] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:24.936 [2024-07-15 09:25:33.748697] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:24.936 [2024-07-15 09:25:33.748708] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:24.936 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:24.937 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.937 09:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:25.196 09:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.196 "name": "Existed_Raid", 00:21:25.196 "uuid": "034e115e-a365-4e41-a42d-fe19170dbc90", 00:21:25.196 "strip_size_kb": 0, 00:21:25.196 "state": "configuring", 00:21:25.196 "raid_level": "raid1", 00:21:25.196 "superblock": true, 00:21:25.196 "num_base_bdevs": 4, 00:21:25.196 "num_base_bdevs_discovered": 0, 00:21:25.196 "num_base_bdevs_operational": 4, 00:21:25.196 "base_bdevs_list": [ 00:21:25.196 { 00:21:25.196 "name": "BaseBdev1", 00:21:25.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.196 "is_configured": false, 00:21:25.196 "data_offset": 0, 00:21:25.196 "data_size": 0 00:21:25.196 }, 00:21:25.196 { 00:21:25.196 "name": "BaseBdev2", 00:21:25.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.196 "is_configured": false, 00:21:25.196 "data_offset": 0, 00:21:25.196 "data_size": 0 00:21:25.196 }, 00:21:25.196 { 00:21:25.196 "name": "BaseBdev3", 00:21:25.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.196 "is_configured": false, 00:21:25.196 "data_offset": 0, 00:21:25.196 "data_size": 0 00:21:25.196 }, 00:21:25.196 { 00:21:25.196 "name": "BaseBdev4", 00:21:25.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.196 "is_configured": false, 00:21:25.196 "data_offset": 0, 00:21:25.196 "data_size": 0 00:21:25.196 } 00:21:25.196 ] 00:21:25.196 }' 00:21:25.196 09:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.196 09:25:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:25.763 09:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:26.022 [2024-07-15 09:25:34.835328] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:26.022 [2024-07-15 09:25:34.835356] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x130faa0 name Existed_Raid, state configuring 00:21:26.022 09:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:26.280 [2024-07-15 09:25:35.144163] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:26.280 [2024-07-15 09:25:35.144199] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:26.280 [2024-07-15 09:25:35.144208] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:26.280 [2024-07-15 09:25:35.144219] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:26.280 [2024-07-15 09:25:35.144228] bdev.c:8157:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:26.280 [2024-07-15 09:25:35.144239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:26.280 [2024-07-15 09:25:35.144248] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:26.280 [2024-07-15 09:25:35.144258] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:26.280 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:26.539 [2024-07-15 09:25:35.402674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:26.539 BaseBdev1 00:21:26.539 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:26.539 09:25:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:26.539 09:25:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:26.539 09:25:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:26.539 09:25:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:26.539 09:25:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:26.539 09:25:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:26.798 09:25:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:27.057 [ 00:21:27.057 { 00:21:27.057 "name": "BaseBdev1", 00:21:27.057 "aliases": [ 00:21:27.057 "cb847b41-9cee-42ca-96ac-11fc23a0b1d9" 00:21:27.057 ], 00:21:27.057 "product_name": "Malloc disk", 00:21:27.057 "block_size": 512, 00:21:27.057 "num_blocks": 65536, 00:21:27.057 "uuid": "cb847b41-9cee-42ca-96ac-11fc23a0b1d9", 00:21:27.057 "assigned_rate_limits": { 00:21:27.057 "rw_ios_per_sec": 0, 00:21:27.057 "rw_mbytes_per_sec": 0, 00:21:27.057 "r_mbytes_per_sec": 0, 00:21:27.057 "w_mbytes_per_sec": 0 00:21:27.057 }, 00:21:27.057 "claimed": true, 00:21:27.057 "claim_type": "exclusive_write", 00:21:27.057 "zoned": false, 00:21:27.057 "supported_io_types": { 00:21:27.057 "read": true, 00:21:27.057 "write": true, 00:21:27.057 "unmap": true, 00:21:27.057 "flush": true, 00:21:27.057 "reset": true, 00:21:27.057 "nvme_admin": false, 00:21:27.057 "nvme_io": false, 00:21:27.058 "nvme_io_md": false, 00:21:27.058 "write_zeroes": true, 00:21:27.058 "zcopy": true, 00:21:27.058 "get_zone_info": false, 00:21:27.058 "zone_management": false, 00:21:27.058 "zone_append": false, 00:21:27.058 "compare": false, 00:21:27.058 "compare_and_write": false, 00:21:27.058 "abort": true, 00:21:27.058 "seek_hole": false, 00:21:27.058 "seek_data": false, 00:21:27.058 "copy": true, 00:21:27.058 "nvme_iov_md": false 00:21:27.058 }, 00:21:27.058 "memory_domains": [ 00:21:27.058 { 00:21:27.058 "dma_device_id": "system", 00:21:27.058 "dma_device_type": 1 00:21:27.058 }, 00:21:27.058 { 00:21:27.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.058 "dma_device_type": 2 00:21:27.058 } 00:21:27.058 ], 00:21:27.058 
"driver_specific": {} 00:21:27.058 } 00:21:27.058 ] 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:27.058 09:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.317 09:25:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.317 "name": "Existed_Raid", 00:21:27.317 "uuid": "17201e20-ceac-4f8f-8c0d-855447484437", 00:21:27.317 "strip_size_kb": 0, 00:21:27.317 "state": "configuring", 00:21:27.317 "raid_level": "raid1", 00:21:27.317 "superblock": true, 00:21:27.317 "num_base_bdevs": 4, 00:21:27.317 "num_base_bdevs_discovered": 1, 00:21:27.317 "num_base_bdevs_operational": 4, 00:21:27.317 "base_bdevs_list": [ 00:21:27.317 { 00:21:27.317 "name": "BaseBdev1", 00:21:27.317 "uuid": "cb847b41-9cee-42ca-96ac-11fc23a0b1d9", 00:21:27.317 "is_configured": true, 00:21:27.317 "data_offset": 2048, 00:21:27.317 "data_size": 63488 00:21:27.317 }, 00:21:27.317 { 00:21:27.317 "name": "BaseBdev2", 00:21:27.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.318 "is_configured": false, 00:21:27.318 "data_offset": 0, 00:21:27.318 "data_size": 0 00:21:27.318 }, 00:21:27.318 { 00:21:27.318 "name": "BaseBdev3", 00:21:27.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.318 "is_configured": false, 00:21:27.318 "data_offset": 0, 00:21:27.318 "data_size": 0 00:21:27.318 }, 00:21:27.318 { 00:21:27.318 "name": "BaseBdev4", 00:21:27.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.318 "is_configured": false, 00:21:27.318 "data_offset": 0, 00:21:27.318 "data_size": 0 00:21:27.318 } 00:21:27.318 ] 00:21:27.318 }' 00:21:27.318 09:25:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.318 09:25:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:27.885 09:25:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:28.143 
[2024-07-15 09:25:36.990888] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:28.143 [2024-07-15 09:25:36.990938] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x130f310 name Existed_Raid, state configuring 00:21:28.143 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:28.403 [2024-07-15 09:25:37.239588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:28.403 [2024-07-15 09:25:37.241040] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:28.403 [2024-07-15 09:25:37.241074] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:28.403 [2024-07-15 09:25:37.241084] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:28.403 [2024-07-15 09:25:37.241095] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:28.403 [2024-07-15 09:25:37.241104] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:28.403 [2024-07-15 09:25:37.241115] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.403 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:28.662 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.662 "name": "Existed_Raid", 00:21:28.662 "uuid": "bc277165-1810-432e-8126-b158e0ee7263", 00:21:28.662 "strip_size_kb": 0, 00:21:28.662 "state": "configuring", 00:21:28.662 "raid_level": "raid1", 00:21:28.662 "superblock": true, 00:21:28.662 
"num_base_bdevs": 4, 00:21:28.662 "num_base_bdevs_discovered": 1, 00:21:28.662 "num_base_bdevs_operational": 4, 00:21:28.662 "base_bdevs_list": [ 00:21:28.662 { 00:21:28.662 "name": "BaseBdev1", 00:21:28.662 "uuid": "cb847b41-9cee-42ca-96ac-11fc23a0b1d9", 00:21:28.662 "is_configured": true, 00:21:28.662 "data_offset": 2048, 00:21:28.662 "data_size": 63488 00:21:28.662 }, 00:21:28.662 { 00:21:28.662 "name": "BaseBdev2", 00:21:28.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.662 "is_configured": false, 00:21:28.662 "data_offset": 0, 00:21:28.662 "data_size": 0 00:21:28.662 }, 00:21:28.662 { 00:21:28.662 "name": "BaseBdev3", 00:21:28.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.662 "is_configured": false, 00:21:28.662 "data_offset": 0, 00:21:28.662 "data_size": 0 00:21:28.662 }, 00:21:28.662 { 00:21:28.662 "name": "BaseBdev4", 00:21:28.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.662 "is_configured": false, 00:21:28.662 "data_offset": 0, 00:21:28.662 "data_size": 0 00:21:28.662 } 00:21:28.662 ] 00:21:28.662 }' 00:21:28.662 09:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.662 09:25:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:29.230 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:29.489 [2024-07-15 09:25:38.370062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:29.489 BaseBdev2 00:21:29.490 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:29.490 09:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:29.490 09:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:29.490 09:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:29.490 09:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:29.490 09:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:29.490 09:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:29.748 09:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:30.008 [ 00:21:30.008 { 00:21:30.008 "name": "BaseBdev2", 00:21:30.008 "aliases": [ 00:21:30.008 "f7aa41f0-e36b-4e67-933f-04bcea748fcd" 00:21:30.008 ], 00:21:30.008 "product_name": "Malloc disk", 00:21:30.008 "block_size": 512, 00:21:30.008 "num_blocks": 65536, 00:21:30.008 "uuid": "f7aa41f0-e36b-4e67-933f-04bcea748fcd", 00:21:30.008 "assigned_rate_limits": { 00:21:30.008 "rw_ios_per_sec": 0, 00:21:30.008 "rw_mbytes_per_sec": 0, 00:21:30.008 "r_mbytes_per_sec": 0, 00:21:30.008 "w_mbytes_per_sec": 0 00:21:30.008 }, 00:21:30.008 "claimed": true, 00:21:30.008 "claim_type": "exclusive_write", 00:21:30.008 "zoned": false, 00:21:30.008 "supported_io_types": { 00:21:30.008 "read": true, 00:21:30.008 "write": true, 00:21:30.008 "unmap": true, 00:21:30.008 "flush": true, 
00:21:30.008 "reset": true, 00:21:30.008 "nvme_admin": false, 00:21:30.008 "nvme_io": false, 00:21:30.008 "nvme_io_md": false, 00:21:30.008 "write_zeroes": true, 00:21:30.008 "zcopy": true, 00:21:30.008 "get_zone_info": false, 00:21:30.008 "zone_management": false, 00:21:30.008 "zone_append": false, 00:21:30.008 "compare": false, 00:21:30.008 "compare_and_write": false, 00:21:30.008 "abort": true, 00:21:30.008 "seek_hole": false, 00:21:30.008 "seek_data": false, 00:21:30.008 "copy": true, 00:21:30.008 "nvme_iov_md": false 00:21:30.008 }, 00:21:30.008 "memory_domains": [ 00:21:30.008 { 00:21:30.008 "dma_device_id": "system", 00:21:30.008 "dma_device_type": 1 00:21:30.008 }, 00:21:30.008 { 00:21:30.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.008 "dma_device_type": 2 00:21:30.008 } 00:21:30.008 ], 00:21:30.008 "driver_specific": {} 00:21:30.008 } 00:21:30.008 ] 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.008 09:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:30.267 09:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.267 "name": "Existed_Raid", 00:21:30.267 "uuid": "bc277165-1810-432e-8126-b158e0ee7263", 00:21:30.267 "strip_size_kb": 0, 00:21:30.267 "state": "configuring", 00:21:30.267 "raid_level": "raid1", 00:21:30.267 "superblock": true, 00:21:30.267 "num_base_bdevs": 4, 00:21:30.267 "num_base_bdevs_discovered": 2, 00:21:30.267 "num_base_bdevs_operational": 4, 00:21:30.267 "base_bdevs_list": [ 00:21:30.267 { 00:21:30.267 "name": "BaseBdev1", 00:21:30.267 "uuid": "cb847b41-9cee-42ca-96ac-11fc23a0b1d9", 00:21:30.267 "is_configured": true, 00:21:30.267 "data_offset": 2048, 00:21:30.267 "data_size": 63488 00:21:30.267 }, 00:21:30.267 { 00:21:30.267 "name": "BaseBdev2", 00:21:30.267 "uuid": 
"f7aa41f0-e36b-4e67-933f-04bcea748fcd", 00:21:30.267 "is_configured": true, 00:21:30.267 "data_offset": 2048, 00:21:30.267 "data_size": 63488 00:21:30.267 }, 00:21:30.267 { 00:21:30.267 "name": "BaseBdev3", 00:21:30.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.267 "is_configured": false, 00:21:30.267 "data_offset": 0, 00:21:30.267 "data_size": 0 00:21:30.267 }, 00:21:30.267 { 00:21:30.267 "name": "BaseBdev4", 00:21:30.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.267 "is_configured": false, 00:21:30.267 "data_offset": 0, 00:21:30.267 "data_size": 0 00:21:30.267 } 00:21:30.267 ] 00:21:30.267 }' 00:21:30.267 09:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.267 09:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:30.835 09:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:31.094 [2024-07-15 09:25:39.949746] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:31.094 BaseBdev3 00:21:31.094 09:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:31.094 09:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:31.094 09:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:31.094 09:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:31.094 09:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:31.094 09:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:31.094 09:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:31.353 09:25:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:31.611 [ 00:21:31.611 { 00:21:31.611 "name": "BaseBdev3", 00:21:31.611 "aliases": [ 00:21:31.611 "a6394935-343d-43ad-9ef5-c6a1fc805dab" 00:21:31.611 ], 00:21:31.611 "product_name": "Malloc disk", 00:21:31.611 "block_size": 512, 00:21:31.611 "num_blocks": 65536, 00:21:31.611 "uuid": "a6394935-343d-43ad-9ef5-c6a1fc805dab", 00:21:31.611 "assigned_rate_limits": { 00:21:31.611 "rw_ios_per_sec": 0, 00:21:31.611 "rw_mbytes_per_sec": 0, 00:21:31.611 "r_mbytes_per_sec": 0, 00:21:31.611 "w_mbytes_per_sec": 0 00:21:31.611 }, 00:21:31.611 "claimed": true, 00:21:31.611 "claim_type": "exclusive_write", 00:21:31.611 "zoned": false, 00:21:31.611 "supported_io_types": { 00:21:31.611 "read": true, 00:21:31.611 "write": true, 00:21:31.611 "unmap": true, 00:21:31.611 "flush": true, 00:21:31.611 "reset": true, 00:21:31.611 "nvme_admin": false, 00:21:31.611 "nvme_io": false, 00:21:31.611 "nvme_io_md": false, 00:21:31.611 "write_zeroes": true, 00:21:31.611 "zcopy": true, 00:21:31.611 "get_zone_info": false, 00:21:31.611 "zone_management": false, 00:21:31.611 "zone_append": false, 00:21:31.611 "compare": false, 00:21:31.611 "compare_and_write": false, 00:21:31.611 "abort": true, 00:21:31.611 "seek_hole": false, 00:21:31.611 
"seek_data": false, 00:21:31.611 "copy": true, 00:21:31.611 "nvme_iov_md": false 00:21:31.611 }, 00:21:31.611 "memory_domains": [ 00:21:31.611 { 00:21:31.611 "dma_device_id": "system", 00:21:31.611 "dma_device_type": 1 00:21:31.611 }, 00:21:31.611 { 00:21:31.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.611 "dma_device_type": 2 00:21:31.611 } 00:21:31.611 ], 00:21:31.611 "driver_specific": {} 00:21:31.611 } 00:21:31.611 ] 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.611 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:31.869 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:31.869 "name": "Existed_Raid", 00:21:31.869 "uuid": "bc277165-1810-432e-8126-b158e0ee7263", 00:21:31.869 "strip_size_kb": 0, 00:21:31.869 "state": "configuring", 00:21:31.869 "raid_level": "raid1", 00:21:31.869 "superblock": true, 00:21:31.869 "num_base_bdevs": 4, 00:21:31.869 "num_base_bdevs_discovered": 3, 00:21:31.869 "num_base_bdevs_operational": 4, 00:21:31.869 "base_bdevs_list": [ 00:21:31.869 { 00:21:31.869 "name": "BaseBdev1", 00:21:31.869 "uuid": "cb847b41-9cee-42ca-96ac-11fc23a0b1d9", 00:21:31.869 "is_configured": true, 00:21:31.869 "data_offset": 2048, 00:21:31.869 "data_size": 63488 00:21:31.869 }, 00:21:31.869 { 00:21:31.869 "name": "BaseBdev2", 00:21:31.869 "uuid": "f7aa41f0-e36b-4e67-933f-04bcea748fcd", 00:21:31.869 "is_configured": true, 00:21:31.869 "data_offset": 2048, 00:21:31.869 "data_size": 63488 00:21:31.869 }, 00:21:31.869 { 00:21:31.869 "name": "BaseBdev3", 00:21:31.869 "uuid": "a6394935-343d-43ad-9ef5-c6a1fc805dab", 00:21:31.869 "is_configured": true, 00:21:31.869 "data_offset": 2048, 00:21:31.869 "data_size": 63488 00:21:31.869 }, 00:21:31.869 { 00:21:31.869 "name": "BaseBdev4", 00:21:31.869 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:21:31.869 "is_configured": false, 00:21:31.869 "data_offset": 0, 00:21:31.869 "data_size": 0 00:21:31.869 } 00:21:31.869 ] 00:21:31.869 }' 00:21:31.869 09:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:31.869 09:25:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:32.436 09:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:32.695 [2024-07-15 09:25:41.569430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:32.695 [2024-07-15 09:25:41.569607] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1310350 00:21:32.695 [2024-07-15 09:25:41.569621] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:32.695 [2024-07-15 09:25:41.569795] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1310020 00:21:32.695 [2024-07-15 09:25:41.569921] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1310350 00:21:32.695 [2024-07-15 09:25:41.569941] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1310350 00:21:32.695 [2024-07-15 09:25:41.570036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:32.695 BaseBdev4 00:21:32.695 09:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:32.695 09:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:32.695 09:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:32.695 09:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:32.695 09:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:32.695 09:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:32.695 09:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:32.953 09:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:33.212 [ 00:21:33.212 { 00:21:33.212 "name": "BaseBdev4", 00:21:33.212 "aliases": [ 00:21:33.212 "845b5f00-9370-440f-8f0b-40399e6a3628" 00:21:33.212 ], 00:21:33.212 "product_name": "Malloc disk", 00:21:33.212 "block_size": 512, 00:21:33.212 "num_blocks": 65536, 00:21:33.212 "uuid": "845b5f00-9370-440f-8f0b-40399e6a3628", 00:21:33.212 "assigned_rate_limits": { 00:21:33.212 "rw_ios_per_sec": 0, 00:21:33.212 "rw_mbytes_per_sec": 0, 00:21:33.212 "r_mbytes_per_sec": 0, 00:21:33.212 "w_mbytes_per_sec": 0 00:21:33.212 }, 00:21:33.212 "claimed": true, 00:21:33.212 "claim_type": "exclusive_write", 00:21:33.212 "zoned": false, 00:21:33.212 "supported_io_types": { 00:21:33.212 "read": true, 00:21:33.212 "write": true, 00:21:33.212 "unmap": true, 00:21:33.212 "flush": true, 00:21:33.212 "reset": true, 00:21:33.212 "nvme_admin": false, 00:21:33.212 "nvme_io": false, 00:21:33.212 "nvme_io_md": false, 00:21:33.212 
"write_zeroes": true, 00:21:33.212 "zcopy": true, 00:21:33.212 "get_zone_info": false, 00:21:33.212 "zone_management": false, 00:21:33.212 "zone_append": false, 00:21:33.212 "compare": false, 00:21:33.212 "compare_and_write": false, 00:21:33.212 "abort": true, 00:21:33.212 "seek_hole": false, 00:21:33.212 "seek_data": false, 00:21:33.212 "copy": true, 00:21:33.212 "nvme_iov_md": false 00:21:33.212 }, 00:21:33.212 "memory_domains": [ 00:21:33.212 { 00:21:33.212 "dma_device_id": "system", 00:21:33.212 "dma_device_type": 1 00:21:33.212 }, 00:21:33.212 { 00:21:33.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.212 "dma_device_type": 2 00:21:33.212 } 00:21:33.212 ], 00:21:33.212 "driver_specific": {} 00:21:33.212 } 00:21:33.212 ] 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.212 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:33.471 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.471 "name": "Existed_Raid", 00:21:33.471 "uuid": "bc277165-1810-432e-8126-b158e0ee7263", 00:21:33.471 "strip_size_kb": 0, 00:21:33.471 "state": "online", 00:21:33.471 "raid_level": "raid1", 00:21:33.471 "superblock": true, 00:21:33.471 "num_base_bdevs": 4, 00:21:33.471 "num_base_bdevs_discovered": 4, 00:21:33.471 "num_base_bdevs_operational": 4, 00:21:33.471 "base_bdevs_list": [ 00:21:33.471 { 00:21:33.471 "name": "BaseBdev1", 00:21:33.471 "uuid": "cb847b41-9cee-42ca-96ac-11fc23a0b1d9", 00:21:33.471 "is_configured": true, 00:21:33.471 "data_offset": 2048, 00:21:33.471 "data_size": 63488 00:21:33.471 }, 00:21:33.471 { 00:21:33.471 "name": "BaseBdev2", 00:21:33.471 "uuid": "f7aa41f0-e36b-4e67-933f-04bcea748fcd", 00:21:33.471 "is_configured": true, 00:21:33.471 "data_offset": 2048, 00:21:33.471 "data_size": 63488 00:21:33.471 }, 00:21:33.471 { 
00:21:33.471 "name": "BaseBdev3", 00:21:33.471 "uuid": "a6394935-343d-43ad-9ef5-c6a1fc805dab", 00:21:33.471 "is_configured": true, 00:21:33.471 "data_offset": 2048, 00:21:33.471 "data_size": 63488 00:21:33.471 }, 00:21:33.471 { 00:21:33.471 "name": "BaseBdev4", 00:21:33.471 "uuid": "845b5f00-9370-440f-8f0b-40399e6a3628", 00:21:33.471 "is_configured": true, 00:21:33.471 "data_offset": 2048, 00:21:33.471 "data_size": 63488 00:21:33.471 } 00:21:33.471 ] 00:21:33.471 }' 00:21:33.471 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.471 09:25:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:34.037 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:34.037 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:34.037 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:34.037 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:34.037 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:34.037 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:34.037 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:34.037 09:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:34.330 [2024-07-15 09:25:43.049676] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:34.330 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:34.330 "name": "Existed_Raid", 00:21:34.330 "aliases": [ 00:21:34.330 "bc277165-1810-432e-8126-b158e0ee7263" 00:21:34.330 ], 00:21:34.330 "product_name": "Raid Volume", 00:21:34.330 "block_size": 512, 00:21:34.330 "num_blocks": 63488, 00:21:34.330 "uuid": "bc277165-1810-432e-8126-b158e0ee7263", 00:21:34.330 "assigned_rate_limits": { 00:21:34.330 "rw_ios_per_sec": 0, 00:21:34.330 "rw_mbytes_per_sec": 0, 00:21:34.330 "r_mbytes_per_sec": 0, 00:21:34.330 "w_mbytes_per_sec": 0 00:21:34.330 }, 00:21:34.330 "claimed": false, 00:21:34.330 "zoned": false, 00:21:34.330 "supported_io_types": { 00:21:34.330 "read": true, 00:21:34.330 "write": true, 00:21:34.330 "unmap": false, 00:21:34.330 "flush": false, 00:21:34.330 "reset": true, 00:21:34.330 "nvme_admin": false, 00:21:34.330 "nvme_io": false, 00:21:34.330 "nvme_io_md": false, 00:21:34.330 "write_zeroes": true, 00:21:34.330 "zcopy": false, 00:21:34.330 "get_zone_info": false, 00:21:34.330 "zone_management": false, 00:21:34.330 "zone_append": false, 00:21:34.330 "compare": false, 00:21:34.330 "compare_and_write": false, 00:21:34.330 "abort": false, 00:21:34.330 "seek_hole": false, 00:21:34.330 "seek_data": false, 00:21:34.330 "copy": false, 00:21:34.330 "nvme_iov_md": false 00:21:34.330 }, 00:21:34.330 "memory_domains": [ 00:21:34.330 { 00:21:34.330 "dma_device_id": "system", 00:21:34.330 "dma_device_type": 1 00:21:34.330 }, 00:21:34.330 { 00:21:34.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.330 "dma_device_type": 2 00:21:34.330 }, 00:21:34.330 { 00:21:34.330 "dma_device_id": "system", 00:21:34.330 "dma_device_type": 1 00:21:34.330 }, 00:21:34.330 { 
00:21:34.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.330 "dma_device_type": 2 00:21:34.330 }, 00:21:34.330 { 00:21:34.330 "dma_device_id": "system", 00:21:34.330 "dma_device_type": 1 00:21:34.330 }, 00:21:34.330 { 00:21:34.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.330 "dma_device_type": 2 00:21:34.330 }, 00:21:34.330 { 00:21:34.330 "dma_device_id": "system", 00:21:34.330 "dma_device_type": 1 00:21:34.330 }, 00:21:34.330 { 00:21:34.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.330 "dma_device_type": 2 00:21:34.330 } 00:21:34.330 ], 00:21:34.330 "driver_specific": { 00:21:34.330 "raid": { 00:21:34.330 "uuid": "bc277165-1810-432e-8126-b158e0ee7263", 00:21:34.330 "strip_size_kb": 0, 00:21:34.330 "state": "online", 00:21:34.330 "raid_level": "raid1", 00:21:34.330 "superblock": true, 00:21:34.330 "num_base_bdevs": 4, 00:21:34.330 "num_base_bdevs_discovered": 4, 00:21:34.330 "num_base_bdevs_operational": 4, 00:21:34.330 "base_bdevs_list": [ 00:21:34.330 { 00:21:34.330 "name": "BaseBdev1", 00:21:34.330 "uuid": "cb847b41-9cee-42ca-96ac-11fc23a0b1d9", 00:21:34.330 "is_configured": true, 00:21:34.330 "data_offset": 2048, 00:21:34.330 "data_size": 63488 00:21:34.330 }, 00:21:34.330 { 00:21:34.330 "name": "BaseBdev2", 00:21:34.330 "uuid": "f7aa41f0-e36b-4e67-933f-04bcea748fcd", 00:21:34.330 "is_configured": true, 00:21:34.330 "data_offset": 2048, 00:21:34.330 "data_size": 63488 00:21:34.330 }, 00:21:34.330 { 00:21:34.330 "name": "BaseBdev3", 00:21:34.330 "uuid": "a6394935-343d-43ad-9ef5-c6a1fc805dab", 00:21:34.330 "is_configured": true, 00:21:34.330 "data_offset": 2048, 00:21:34.330 "data_size": 63488 00:21:34.330 }, 00:21:34.330 { 00:21:34.330 "name": "BaseBdev4", 00:21:34.330 "uuid": "845b5f00-9370-440f-8f0b-40399e6a3628", 00:21:34.330 "is_configured": true, 00:21:34.330 "data_offset": 2048, 00:21:34.330 "data_size": 63488 00:21:34.330 } 00:21:34.330 ] 00:21:34.330 } 00:21:34.330 } 00:21:34.330 }' 00:21:34.330 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:34.330 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:34.330 BaseBdev2 00:21:34.330 BaseBdev3 00:21:34.330 BaseBdev4' 00:21:34.330 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:34.330 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:34.330 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:34.588 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:34.588 "name": "BaseBdev1", 00:21:34.588 "aliases": [ 00:21:34.588 "cb847b41-9cee-42ca-96ac-11fc23a0b1d9" 00:21:34.588 ], 00:21:34.588 "product_name": "Malloc disk", 00:21:34.588 "block_size": 512, 00:21:34.588 "num_blocks": 65536, 00:21:34.588 "uuid": "cb847b41-9cee-42ca-96ac-11fc23a0b1d9", 00:21:34.588 "assigned_rate_limits": { 00:21:34.588 "rw_ios_per_sec": 0, 00:21:34.588 "rw_mbytes_per_sec": 0, 00:21:34.588 "r_mbytes_per_sec": 0, 00:21:34.588 "w_mbytes_per_sec": 0 00:21:34.588 }, 00:21:34.588 "claimed": true, 00:21:34.588 "claim_type": "exclusive_write", 00:21:34.588 "zoned": false, 00:21:34.588 "supported_io_types": { 00:21:34.588 "read": true, 00:21:34.588 "write": true, 
00:21:34.588 "unmap": true, 00:21:34.588 "flush": true, 00:21:34.588 "reset": true, 00:21:34.588 "nvme_admin": false, 00:21:34.588 "nvme_io": false, 00:21:34.588 "nvme_io_md": false, 00:21:34.588 "write_zeroes": true, 00:21:34.588 "zcopy": true, 00:21:34.588 "get_zone_info": false, 00:21:34.588 "zone_management": false, 00:21:34.588 "zone_append": false, 00:21:34.588 "compare": false, 00:21:34.588 "compare_and_write": false, 00:21:34.588 "abort": true, 00:21:34.588 "seek_hole": false, 00:21:34.588 "seek_data": false, 00:21:34.588 "copy": true, 00:21:34.588 "nvme_iov_md": false 00:21:34.588 }, 00:21:34.588 "memory_domains": [ 00:21:34.588 { 00:21:34.588 "dma_device_id": "system", 00:21:34.588 "dma_device_type": 1 00:21:34.588 }, 00:21:34.588 { 00:21:34.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.588 "dma_device_type": 2 00:21:34.588 } 00:21:34.588 ], 00:21:34.588 "driver_specific": {} 00:21:34.588 }' 00:21:34.588 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.588 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.588 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:34.588 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.588 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:34.846 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:35.104 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:35.104 "name": "BaseBdev2", 00:21:35.104 "aliases": [ 00:21:35.104 "f7aa41f0-e36b-4e67-933f-04bcea748fcd" 00:21:35.104 ], 00:21:35.104 "product_name": "Malloc disk", 00:21:35.104 "block_size": 512, 00:21:35.104 "num_blocks": 65536, 00:21:35.104 "uuid": "f7aa41f0-e36b-4e67-933f-04bcea748fcd", 00:21:35.104 "assigned_rate_limits": { 00:21:35.104 "rw_ios_per_sec": 0, 00:21:35.104 "rw_mbytes_per_sec": 0, 00:21:35.104 "r_mbytes_per_sec": 0, 00:21:35.104 "w_mbytes_per_sec": 0 00:21:35.104 }, 00:21:35.104 "claimed": true, 00:21:35.104 "claim_type": "exclusive_write", 00:21:35.104 "zoned": false, 00:21:35.104 "supported_io_types": { 00:21:35.104 "read": true, 00:21:35.104 "write": true, 00:21:35.104 "unmap": true, 00:21:35.104 "flush": true, 00:21:35.104 "reset": true, 00:21:35.104 "nvme_admin": false, 00:21:35.104 
"nvme_io": false, 00:21:35.104 "nvme_io_md": false, 00:21:35.104 "write_zeroes": true, 00:21:35.104 "zcopy": true, 00:21:35.104 "get_zone_info": false, 00:21:35.104 "zone_management": false, 00:21:35.104 "zone_append": false, 00:21:35.104 "compare": false, 00:21:35.104 "compare_and_write": false, 00:21:35.104 "abort": true, 00:21:35.104 "seek_hole": false, 00:21:35.104 "seek_data": false, 00:21:35.104 "copy": true, 00:21:35.104 "nvme_iov_md": false 00:21:35.104 }, 00:21:35.104 "memory_domains": [ 00:21:35.104 { 00:21:35.104 "dma_device_id": "system", 00:21:35.104 "dma_device_type": 1 00:21:35.104 }, 00:21:35.104 { 00:21:35.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.104 "dma_device_type": 2 00:21:35.104 } 00:21:35.104 ], 00:21:35.104 "driver_specific": {} 00:21:35.104 }' 00:21:35.104 09:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.104 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.104 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:35.361 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.361 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.361 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:35.361 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.361 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.361 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:35.361 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.361 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.619 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:35.619 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:35.619 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:35.619 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:35.619 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:35.619 "name": "BaseBdev3", 00:21:35.619 "aliases": [ 00:21:35.619 "a6394935-343d-43ad-9ef5-c6a1fc805dab" 00:21:35.619 ], 00:21:35.619 "product_name": "Malloc disk", 00:21:35.619 "block_size": 512, 00:21:35.619 "num_blocks": 65536, 00:21:35.619 "uuid": "a6394935-343d-43ad-9ef5-c6a1fc805dab", 00:21:35.619 "assigned_rate_limits": { 00:21:35.619 "rw_ios_per_sec": 0, 00:21:35.619 "rw_mbytes_per_sec": 0, 00:21:35.619 "r_mbytes_per_sec": 0, 00:21:35.619 "w_mbytes_per_sec": 0 00:21:35.619 }, 00:21:35.619 "claimed": true, 00:21:35.619 "claim_type": "exclusive_write", 00:21:35.619 "zoned": false, 00:21:35.619 "supported_io_types": { 00:21:35.619 "read": true, 00:21:35.619 "write": true, 00:21:35.619 "unmap": true, 00:21:35.619 "flush": true, 00:21:35.619 "reset": true, 00:21:35.619 "nvme_admin": false, 00:21:35.619 "nvme_io": false, 00:21:35.619 "nvme_io_md": false, 00:21:35.619 "write_zeroes": true, 00:21:35.619 "zcopy": true, 00:21:35.619 
"get_zone_info": false, 00:21:35.619 "zone_management": false, 00:21:35.619 "zone_append": false, 00:21:35.619 "compare": false, 00:21:35.619 "compare_and_write": false, 00:21:35.619 "abort": true, 00:21:35.619 "seek_hole": false, 00:21:35.619 "seek_data": false, 00:21:35.619 "copy": true, 00:21:35.619 "nvme_iov_md": false 00:21:35.619 }, 00:21:35.619 "memory_domains": [ 00:21:35.619 { 00:21:35.619 "dma_device_id": "system", 00:21:35.619 "dma_device_type": 1 00:21:35.619 }, 00:21:35.619 { 00:21:35.619 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.619 "dma_device_type": 2 00:21:35.619 } 00:21:35.619 ], 00:21:35.619 "driver_specific": {} 00:21:35.619 }' 00:21:35.619 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.877 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.877 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:35.877 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.877 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.877 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:35.877 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.877 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.877 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:35.877 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.136 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.136 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:36.136 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:36.136 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:36.136 09:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:36.394 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:36.394 "name": "BaseBdev4", 00:21:36.394 "aliases": [ 00:21:36.394 "845b5f00-9370-440f-8f0b-40399e6a3628" 00:21:36.394 ], 00:21:36.394 "product_name": "Malloc disk", 00:21:36.394 "block_size": 512, 00:21:36.394 "num_blocks": 65536, 00:21:36.394 "uuid": "845b5f00-9370-440f-8f0b-40399e6a3628", 00:21:36.394 "assigned_rate_limits": { 00:21:36.394 "rw_ios_per_sec": 0, 00:21:36.394 "rw_mbytes_per_sec": 0, 00:21:36.394 "r_mbytes_per_sec": 0, 00:21:36.394 "w_mbytes_per_sec": 0 00:21:36.394 }, 00:21:36.394 "claimed": true, 00:21:36.394 "claim_type": "exclusive_write", 00:21:36.394 "zoned": false, 00:21:36.394 "supported_io_types": { 00:21:36.394 "read": true, 00:21:36.394 "write": true, 00:21:36.394 "unmap": true, 00:21:36.395 "flush": true, 00:21:36.395 "reset": true, 00:21:36.395 "nvme_admin": false, 00:21:36.395 "nvme_io": false, 00:21:36.395 "nvme_io_md": false, 00:21:36.395 "write_zeroes": true, 00:21:36.395 "zcopy": true, 00:21:36.395 "get_zone_info": false, 00:21:36.395 "zone_management": false, 00:21:36.395 "zone_append": false, 00:21:36.395 "compare": false, 
00:21:36.395 "compare_and_write": false, 00:21:36.395 "abort": true, 00:21:36.395 "seek_hole": false, 00:21:36.395 "seek_data": false, 00:21:36.395 "copy": true, 00:21:36.395 "nvme_iov_md": false 00:21:36.395 }, 00:21:36.395 "memory_domains": [ 00:21:36.395 { 00:21:36.395 "dma_device_id": "system", 00:21:36.395 "dma_device_type": 1 00:21:36.395 }, 00:21:36.395 { 00:21:36.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:36.395 "dma_device_type": 2 00:21:36.395 } 00:21:36.395 ], 00:21:36.395 "driver_specific": {} 00:21:36.395 }' 00:21:36.395 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:36.395 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:36.395 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:36.395 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:36.395 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:36.395 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:36.395 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.653 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.653 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:36.653 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.653 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.653 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:36.653 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:36.912 [2024-07-15 09:25:45.632244] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.912 09:25:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:36.912 "name": "Existed_Raid", 00:21:36.912 "uuid": "bc277165-1810-432e-8126-b158e0ee7263", 00:21:36.912 "strip_size_kb": 0, 00:21:36.912 "state": "online", 00:21:36.912 "raid_level": "raid1", 00:21:36.912 "superblock": true, 00:21:36.912 "num_base_bdevs": 4, 00:21:36.912 "num_base_bdevs_discovered": 3, 00:21:36.912 "num_base_bdevs_operational": 3, 00:21:36.912 "base_bdevs_list": [ 00:21:36.912 { 00:21:36.912 "name": null, 00:21:36.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.912 "is_configured": false, 00:21:36.912 "data_offset": 2048, 00:21:36.912 "data_size": 63488 00:21:36.912 }, 00:21:36.912 { 00:21:36.912 "name": "BaseBdev2", 00:21:36.912 "uuid": "f7aa41f0-e36b-4e67-933f-04bcea748fcd", 00:21:36.912 "is_configured": true, 00:21:36.912 "data_offset": 2048, 00:21:36.912 "data_size": 63488 00:21:36.912 }, 00:21:36.912 { 00:21:36.912 "name": "BaseBdev3", 00:21:36.912 "uuid": "a6394935-343d-43ad-9ef5-c6a1fc805dab", 00:21:36.912 "is_configured": true, 00:21:36.912 "data_offset": 2048, 00:21:36.912 "data_size": 63488 00:21:36.912 }, 00:21:36.912 { 00:21:36.912 "name": "BaseBdev4", 00:21:36.912 "uuid": "845b5f00-9370-440f-8f0b-40399e6a3628", 00:21:36.912 "is_configured": true, 00:21:36.912 "data_offset": 2048, 00:21:36.912 "data_size": 63488 00:21:36.912 } 00:21:36.912 ] 00:21:36.912 }' 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.912 09:25:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:37.479 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:37.479 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:37.479 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.479 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:37.737 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:37.737 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:37.737 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:37.995 [2024-07-15 09:25:46.804466] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:37.995 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:37.995 09:25:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:37.995 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.995 09:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:38.253 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:38.253 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:38.253 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:38.511 [2024-07-15 09:25:47.306359] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:38.511 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:38.511 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:38.511 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.511 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:38.769 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:38.769 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:38.769 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:39.028 [2024-07-15 09:25:47.816247] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:39.028 [2024-07-15 09:25:47.816375] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:39.028 [2024-07-15 09:25:47.839695] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:39.028 [2024-07-15 09:25:47.839735] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:39.028 [2024-07-15 09:25:47.839747] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1310350 name Existed_Raid, state offline 00:21:39.028 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:39.028 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:39.028 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.028 09:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:39.286 09:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:39.286 09:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:39.286 09:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:39.286 09:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:39.286 09:25:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:39.286 09:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:39.545 BaseBdev2 00:21:39.545 09:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:39.545 09:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:39.545 09:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:39.545 09:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:39.545 09:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:39.545 09:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:39.545 09:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:39.803 09:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:40.061 [ 00:21:40.061 { 00:21:40.061 "name": "BaseBdev2", 00:21:40.061 "aliases": [ 00:21:40.061 "4621af3e-ea64-4a16-bdc5-f8ae3185654b" 00:21:40.061 ], 00:21:40.061 "product_name": "Malloc disk", 00:21:40.061 "block_size": 512, 00:21:40.061 "num_blocks": 65536, 00:21:40.061 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:40.061 "assigned_rate_limits": { 00:21:40.061 "rw_ios_per_sec": 0, 00:21:40.061 "rw_mbytes_per_sec": 0, 00:21:40.061 "r_mbytes_per_sec": 0, 00:21:40.061 "w_mbytes_per_sec": 0 00:21:40.061 }, 00:21:40.061 "claimed": false, 00:21:40.061 "zoned": false, 00:21:40.062 "supported_io_types": { 00:21:40.062 "read": true, 00:21:40.062 "write": true, 00:21:40.062 "unmap": true, 00:21:40.062 "flush": true, 00:21:40.062 "reset": true, 00:21:40.062 "nvme_admin": false, 00:21:40.062 "nvme_io": false, 00:21:40.062 "nvme_io_md": false, 00:21:40.062 "write_zeroes": true, 00:21:40.062 "zcopy": true, 00:21:40.062 "get_zone_info": false, 00:21:40.062 "zone_management": false, 00:21:40.062 "zone_append": false, 00:21:40.062 "compare": false, 00:21:40.062 "compare_and_write": false, 00:21:40.062 "abort": true, 00:21:40.062 "seek_hole": false, 00:21:40.062 "seek_data": false, 00:21:40.062 "copy": true, 00:21:40.062 "nvme_iov_md": false 00:21:40.062 }, 00:21:40.062 "memory_domains": [ 00:21:40.062 { 00:21:40.062 "dma_device_id": "system", 00:21:40.062 "dma_device_type": 1 00:21:40.062 }, 00:21:40.062 { 00:21:40.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:40.062 "dma_device_type": 2 00:21:40.062 } 00:21:40.062 ], 00:21:40.062 "driver_specific": {} 00:21:40.062 } 00:21:40.062 ] 00:21:40.062 09:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:40.062 09:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:40.062 09:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:40.062 09:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:40.320 BaseBdev3 00:21:40.320 09:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:40.320 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:40.320 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:40.320 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:40.320 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:40.320 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:40.320 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:40.578 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:40.836 [ 00:21:40.836 { 00:21:40.836 "name": "BaseBdev3", 00:21:40.836 "aliases": [ 00:21:40.836 "a56bfba1-f98e-457c-91bb-71c2a4ac6246" 00:21:40.836 ], 00:21:40.836 "product_name": "Malloc disk", 00:21:40.836 "block_size": 512, 00:21:40.836 "num_blocks": 65536, 00:21:40.836 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:40.836 "assigned_rate_limits": { 00:21:40.836 "rw_ios_per_sec": 0, 00:21:40.836 "rw_mbytes_per_sec": 0, 00:21:40.836 "r_mbytes_per_sec": 0, 00:21:40.836 "w_mbytes_per_sec": 0 00:21:40.836 }, 00:21:40.836 "claimed": false, 00:21:40.836 "zoned": false, 00:21:40.836 "supported_io_types": { 00:21:40.836 "read": true, 00:21:40.836 "write": true, 00:21:40.836 "unmap": true, 00:21:40.836 "flush": true, 00:21:40.836 "reset": true, 00:21:40.836 "nvme_admin": false, 00:21:40.836 "nvme_io": false, 00:21:40.836 "nvme_io_md": false, 00:21:40.836 "write_zeroes": true, 00:21:40.836 "zcopy": true, 00:21:40.836 "get_zone_info": false, 00:21:40.836 "zone_management": false, 00:21:40.836 "zone_append": false, 00:21:40.836 "compare": false, 00:21:40.836 "compare_and_write": false, 00:21:40.836 "abort": true, 00:21:40.836 "seek_hole": false, 00:21:40.836 "seek_data": false, 00:21:40.836 "copy": true, 00:21:40.836 "nvme_iov_md": false 00:21:40.836 }, 00:21:40.836 "memory_domains": [ 00:21:40.836 { 00:21:40.836 "dma_device_id": "system", 00:21:40.836 "dma_device_type": 1 00:21:40.836 }, 00:21:40.836 { 00:21:40.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:40.836 "dma_device_type": 2 00:21:40.836 } 00:21:40.836 ], 00:21:40.836 "driver_specific": {} 00:21:40.836 } 00:21:40.836 ] 00:21:40.836 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:40.836 09:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:40.836 09:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:40.836 09:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:41.095 BaseBdev4 00:21:41.095 09:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:41.095 09:25:49 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:41.095 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:41.095 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:41.095 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:41.095 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:41.095 09:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:41.354 09:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:41.612 [ 00:21:41.612 { 00:21:41.612 "name": "BaseBdev4", 00:21:41.612 "aliases": [ 00:21:41.612 "057bc3e3-f719-42c9-80a3-e6f215e56d2f" 00:21:41.612 ], 00:21:41.612 "product_name": "Malloc disk", 00:21:41.612 "block_size": 512, 00:21:41.612 "num_blocks": 65536, 00:21:41.612 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:41.612 "assigned_rate_limits": { 00:21:41.612 "rw_ios_per_sec": 0, 00:21:41.612 "rw_mbytes_per_sec": 0, 00:21:41.612 "r_mbytes_per_sec": 0, 00:21:41.613 "w_mbytes_per_sec": 0 00:21:41.613 }, 00:21:41.613 "claimed": false, 00:21:41.613 "zoned": false, 00:21:41.613 "supported_io_types": { 00:21:41.613 "read": true, 00:21:41.613 "write": true, 00:21:41.613 "unmap": true, 00:21:41.613 "flush": true, 00:21:41.613 "reset": true, 00:21:41.613 "nvme_admin": false, 00:21:41.613 "nvme_io": false, 00:21:41.613 "nvme_io_md": false, 00:21:41.613 "write_zeroes": true, 00:21:41.613 "zcopy": true, 00:21:41.613 "get_zone_info": false, 00:21:41.613 "zone_management": false, 00:21:41.613 "zone_append": false, 00:21:41.613 "compare": false, 00:21:41.613 "compare_and_write": false, 00:21:41.613 "abort": true, 00:21:41.613 "seek_hole": false, 00:21:41.613 "seek_data": false, 00:21:41.613 "copy": true, 00:21:41.613 "nvme_iov_md": false 00:21:41.613 }, 00:21:41.613 "memory_domains": [ 00:21:41.613 { 00:21:41.613 "dma_device_id": "system", 00:21:41.613 "dma_device_type": 1 00:21:41.613 }, 00:21:41.613 { 00:21:41.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:41.613 "dma_device_type": 2 00:21:41.613 } 00:21:41.613 ], 00:21:41.613 "driver_specific": {} 00:21:41.613 } 00:21:41.613 ] 00:21:41.613 09:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:41.613 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:41.613 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:41.613 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:41.871 [2024-07-15 09:25:50.672225] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:41.871 [2024-07-15 09:25:50.672269] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:41.871 [2024-07-15 09:25:50.672291] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev2 is claimed 00:21:41.871 [2024-07-15 09:25:50.673782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:41.871 [2024-07-15 09:25:50.673827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.871 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:42.130 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.130 "name": "Existed_Raid", 00:21:42.130 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:42.130 "strip_size_kb": 0, 00:21:42.130 "state": "configuring", 00:21:42.130 "raid_level": "raid1", 00:21:42.130 "superblock": true, 00:21:42.130 "num_base_bdevs": 4, 00:21:42.130 "num_base_bdevs_discovered": 3, 00:21:42.130 "num_base_bdevs_operational": 4, 00:21:42.130 "base_bdevs_list": [ 00:21:42.130 { 00:21:42.130 "name": "BaseBdev1", 00:21:42.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.130 "is_configured": false, 00:21:42.130 "data_offset": 0, 00:21:42.130 "data_size": 0 00:21:42.130 }, 00:21:42.130 { 00:21:42.130 "name": "BaseBdev2", 00:21:42.130 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:42.130 "is_configured": true, 00:21:42.130 "data_offset": 2048, 00:21:42.130 "data_size": 63488 00:21:42.130 }, 00:21:42.130 { 00:21:42.130 "name": "BaseBdev3", 00:21:42.130 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:42.130 "is_configured": true, 00:21:42.130 "data_offset": 2048, 00:21:42.130 "data_size": 63488 00:21:42.130 }, 00:21:42.130 { 00:21:42.130 "name": "BaseBdev4", 00:21:42.130 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:42.130 "is_configured": true, 00:21:42.130 "data_offset": 2048, 00:21:42.130 "data_size": 63488 00:21:42.130 } 00:21:42.130 ] 00:21:42.130 }' 00:21:42.130 09:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.130 09:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:42.697 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:42.955 [2024-07-15 09:25:51.739018] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.955 09:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:43.212 09:25:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:43.212 "name": "Existed_Raid", 00:21:43.212 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:43.212 "strip_size_kb": 0, 00:21:43.212 "state": "configuring", 00:21:43.212 "raid_level": "raid1", 00:21:43.212 "superblock": true, 00:21:43.212 "num_base_bdevs": 4, 00:21:43.212 "num_base_bdevs_discovered": 2, 00:21:43.212 "num_base_bdevs_operational": 4, 00:21:43.212 "base_bdevs_list": [ 00:21:43.213 { 00:21:43.213 "name": "BaseBdev1", 00:21:43.213 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.213 "is_configured": false, 00:21:43.213 "data_offset": 0, 00:21:43.213 "data_size": 0 00:21:43.213 }, 00:21:43.213 { 00:21:43.213 "name": null, 00:21:43.213 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:43.213 "is_configured": false, 00:21:43.213 "data_offset": 2048, 00:21:43.213 "data_size": 63488 00:21:43.213 }, 00:21:43.213 { 00:21:43.213 "name": "BaseBdev3", 00:21:43.213 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:43.213 "is_configured": true, 00:21:43.213 "data_offset": 2048, 00:21:43.213 "data_size": 63488 00:21:43.213 }, 00:21:43.213 { 00:21:43.213 "name": "BaseBdev4", 00:21:43.213 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:43.213 "is_configured": true, 00:21:43.213 "data_offset": 2048, 00:21:43.213 "data_size": 63488 00:21:43.213 } 00:21:43.213 ] 00:21:43.213 }' 00:21:43.213 09:25:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:43.213 09:25:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:43.777 09:25:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.777 09:25:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:44.034 09:25:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:44.034 09:25:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:44.290 [2024-07-15 09:25:53.103792] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:44.290 BaseBdev1 00:21:44.290 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:44.290 09:25:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:44.290 09:25:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:44.290 09:25:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:44.290 09:25:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:44.290 09:25:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:44.291 09:25:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:44.547 09:25:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:44.805 [ 00:21:44.805 { 00:21:44.805 "name": "BaseBdev1", 00:21:44.805 "aliases": [ 00:21:44.805 "27e54397-59d8-4264-a442-9b2790b132ba" 00:21:44.805 ], 00:21:44.805 "product_name": "Malloc disk", 00:21:44.805 "block_size": 512, 00:21:44.805 "num_blocks": 65536, 00:21:44.805 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:44.805 "assigned_rate_limits": { 00:21:44.805 "rw_ios_per_sec": 0, 00:21:44.805 "rw_mbytes_per_sec": 0, 00:21:44.805 "r_mbytes_per_sec": 0, 00:21:44.805 "w_mbytes_per_sec": 0 00:21:44.805 }, 00:21:44.805 "claimed": true, 00:21:44.805 "claim_type": "exclusive_write", 00:21:44.805 "zoned": false, 00:21:44.805 "supported_io_types": { 00:21:44.805 "read": true, 00:21:44.805 "write": true, 00:21:44.805 "unmap": true, 00:21:44.805 "flush": true, 00:21:44.805 "reset": true, 00:21:44.805 "nvme_admin": false, 00:21:44.805 "nvme_io": false, 00:21:44.805 "nvme_io_md": false, 00:21:44.805 "write_zeroes": true, 00:21:44.805 "zcopy": true, 00:21:44.805 "get_zone_info": false, 00:21:44.805 "zone_management": false, 00:21:44.805 "zone_append": false, 00:21:44.805 "compare": false, 00:21:44.805 "compare_and_write": false, 00:21:44.805 "abort": true, 00:21:44.805 "seek_hole": false, 00:21:44.805 "seek_data": false, 00:21:44.805 "copy": true, 00:21:44.805 "nvme_iov_md": false 00:21:44.805 }, 00:21:44.805 "memory_domains": [ 00:21:44.805 { 00:21:44.805 "dma_device_id": "system", 00:21:44.805 "dma_device_type": 1 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.805 "dma_device_type": 2 00:21:44.805 } 00:21:44.805 ], 00:21:44.805 "driver_specific": {} 00:21:44.805 } 00:21:44.805 ] 00:21:44.805 09:25:53 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.805 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.062 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.062 "name": "Existed_Raid", 00:21:45.062 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:45.062 "strip_size_kb": 0, 00:21:45.062 "state": "configuring", 00:21:45.062 "raid_level": "raid1", 00:21:45.062 "superblock": true, 00:21:45.062 "num_base_bdevs": 4, 00:21:45.062 "num_base_bdevs_discovered": 3, 00:21:45.062 "num_base_bdevs_operational": 4, 00:21:45.062 "base_bdevs_list": [ 00:21:45.062 { 00:21:45.062 "name": "BaseBdev1", 00:21:45.062 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:45.062 "is_configured": true, 00:21:45.062 "data_offset": 2048, 00:21:45.062 "data_size": 63488 00:21:45.062 }, 00:21:45.062 { 00:21:45.062 "name": null, 00:21:45.062 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:45.062 "is_configured": false, 00:21:45.062 "data_offset": 2048, 00:21:45.062 "data_size": 63488 00:21:45.062 }, 00:21:45.062 { 00:21:45.062 "name": "BaseBdev3", 00:21:45.062 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:45.062 "is_configured": true, 00:21:45.062 "data_offset": 2048, 00:21:45.062 "data_size": 63488 00:21:45.062 }, 00:21:45.062 { 00:21:45.062 "name": "BaseBdev4", 00:21:45.062 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:45.062 "is_configured": true, 00:21:45.063 "data_offset": 2048, 00:21:45.063 "data_size": 63488 00:21:45.063 } 00:21:45.063 ] 00:21:45.063 }' 00:21:45.063 09:25:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.063 09:25:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:45.627 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.627 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
jq '.[0].base_bdevs_list[0].is_configured' 00:21:45.885 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:45.885 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:46.143 [2024-07-15 09:25:54.964730] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:46.143 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:46.143 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:46.143 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:46.143 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:46.143 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:46.143 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:46.144 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.144 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.144 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.144 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.144 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.144 09:25:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:46.402 09:25:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:46.402 "name": "Existed_Raid", 00:21:46.402 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:46.402 "strip_size_kb": 0, 00:21:46.402 "state": "configuring", 00:21:46.402 "raid_level": "raid1", 00:21:46.402 "superblock": true, 00:21:46.402 "num_base_bdevs": 4, 00:21:46.402 "num_base_bdevs_discovered": 2, 00:21:46.402 "num_base_bdevs_operational": 4, 00:21:46.402 "base_bdevs_list": [ 00:21:46.402 { 00:21:46.402 "name": "BaseBdev1", 00:21:46.402 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:46.402 "is_configured": true, 00:21:46.402 "data_offset": 2048, 00:21:46.402 "data_size": 63488 00:21:46.402 }, 00:21:46.402 { 00:21:46.402 "name": null, 00:21:46.402 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:46.402 "is_configured": false, 00:21:46.402 "data_offset": 2048, 00:21:46.402 "data_size": 63488 00:21:46.402 }, 00:21:46.402 { 00:21:46.402 "name": null, 00:21:46.402 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:46.402 "is_configured": false, 00:21:46.402 "data_offset": 2048, 00:21:46.402 "data_size": 63488 00:21:46.402 }, 00:21:46.402 { 00:21:46.402 "name": "BaseBdev4", 00:21:46.402 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:46.402 "is_configured": true, 00:21:46.402 "data_offset": 2048, 00:21:46.402 "data_size": 63488 00:21:46.402 } 00:21:46.402 ] 00:21:46.402 }' 00:21:46.402 09:25:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:21:46.402 09:25:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:46.969 09:25:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.969 09:25:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:47.228 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:47.228 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:47.487 [2024-07-15 09:25:56.292279] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.487 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:47.745 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.745 "name": "Existed_Raid", 00:21:47.745 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:47.745 "strip_size_kb": 0, 00:21:47.745 "state": "configuring", 00:21:47.745 "raid_level": "raid1", 00:21:47.745 "superblock": true, 00:21:47.745 "num_base_bdevs": 4, 00:21:47.745 "num_base_bdevs_discovered": 3, 00:21:47.745 "num_base_bdevs_operational": 4, 00:21:47.745 "base_bdevs_list": [ 00:21:47.745 { 00:21:47.745 "name": "BaseBdev1", 00:21:47.745 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:47.745 "is_configured": true, 00:21:47.745 "data_offset": 2048, 00:21:47.745 "data_size": 63488 00:21:47.745 }, 00:21:47.745 { 00:21:47.745 "name": null, 00:21:47.745 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:47.745 "is_configured": false, 00:21:47.745 "data_offset": 2048, 00:21:47.745 "data_size": 63488 00:21:47.745 }, 00:21:47.745 { 00:21:47.745 "name": "BaseBdev3", 00:21:47.745 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:47.745 "is_configured": true, 00:21:47.745 
"data_offset": 2048, 00:21:47.745 "data_size": 63488 00:21:47.745 }, 00:21:47.745 { 00:21:47.745 "name": "BaseBdev4", 00:21:47.745 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:47.745 "is_configured": true, 00:21:47.745 "data_offset": 2048, 00:21:47.745 "data_size": 63488 00:21:47.745 } 00:21:47.745 ] 00:21:47.745 }' 00:21:47.745 09:25:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.745 09:25:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:48.373 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:48.373 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.659 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:48.659 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:48.916 [2024-07-15 09:25:57.635848] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:48.916 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:48.916 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:48.916 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:48.917 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:48.917 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:48.917 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:48.917 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.917 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.917 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.917 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.917 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.917 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:49.174 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.174 "name": "Existed_Raid", 00:21:49.174 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:49.174 "strip_size_kb": 0, 00:21:49.174 "state": "configuring", 00:21:49.174 "raid_level": "raid1", 00:21:49.174 "superblock": true, 00:21:49.174 "num_base_bdevs": 4, 00:21:49.174 "num_base_bdevs_discovered": 2, 00:21:49.174 "num_base_bdevs_operational": 4, 00:21:49.174 "base_bdevs_list": [ 00:21:49.174 { 00:21:49.174 "name": null, 00:21:49.174 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:49.174 "is_configured": false, 00:21:49.174 "data_offset": 2048, 00:21:49.174 "data_size": 63488 00:21:49.174 }, 
00:21:49.174 { 00:21:49.174 "name": null, 00:21:49.174 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:49.174 "is_configured": false, 00:21:49.174 "data_offset": 2048, 00:21:49.174 "data_size": 63488 00:21:49.174 }, 00:21:49.174 { 00:21:49.174 "name": "BaseBdev3", 00:21:49.174 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:49.174 "is_configured": true, 00:21:49.174 "data_offset": 2048, 00:21:49.174 "data_size": 63488 00:21:49.174 }, 00:21:49.174 { 00:21:49.174 "name": "BaseBdev4", 00:21:49.174 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:49.174 "is_configured": true, 00:21:49.174 "data_offset": 2048, 00:21:49.174 "data_size": 63488 00:21:49.174 } 00:21:49.174 ] 00:21:49.174 }' 00:21:49.174 09:25:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.174 09:25:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:49.739 09:25:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.739 09:25:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:50.022 09:25:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:50.022 09:25:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:50.280 [2024-07-15 09:25:58.991740] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.280 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:50.537 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.538 "name": "Existed_Raid", 00:21:50.538 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:50.538 "strip_size_kb": 0, 00:21:50.538 "state": "configuring", 00:21:50.538 "raid_level": 
"raid1", 00:21:50.538 "superblock": true, 00:21:50.538 "num_base_bdevs": 4, 00:21:50.538 "num_base_bdevs_discovered": 3, 00:21:50.538 "num_base_bdevs_operational": 4, 00:21:50.538 "base_bdevs_list": [ 00:21:50.538 { 00:21:50.538 "name": null, 00:21:50.538 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:50.538 "is_configured": false, 00:21:50.538 "data_offset": 2048, 00:21:50.538 "data_size": 63488 00:21:50.538 }, 00:21:50.538 { 00:21:50.538 "name": "BaseBdev2", 00:21:50.538 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:50.538 "is_configured": true, 00:21:50.538 "data_offset": 2048, 00:21:50.538 "data_size": 63488 00:21:50.538 }, 00:21:50.538 { 00:21:50.538 "name": "BaseBdev3", 00:21:50.538 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:50.538 "is_configured": true, 00:21:50.538 "data_offset": 2048, 00:21:50.538 "data_size": 63488 00:21:50.538 }, 00:21:50.538 { 00:21:50.538 "name": "BaseBdev4", 00:21:50.538 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:50.538 "is_configured": true, 00:21:50.538 "data_offset": 2048, 00:21:50.538 "data_size": 63488 00:21:50.538 } 00:21:50.538 ] 00:21:50.538 }' 00:21:50.538 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.538 09:25:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:51.103 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.103 09:25:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:51.363 09:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:51.364 09:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.364 09:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:51.622 09:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 27e54397-59d8-4264-a442-9b2790b132ba 00:21:51.880 [2024-07-15 09:26:00.592955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:51.880 [2024-07-15 09:26:00.593142] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1312180 00:21:51.880 [2024-07-15 09:26:00.593155] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:51.880 [2024-07-15 09:26:00.593337] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1312c20 00:21:51.880 [2024-07-15 09:26:00.593488] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1312180 00:21:51.880 [2024-07-15 09:26:00.593499] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1312180 00:21:51.880 [2024-07-15 09:26:00.593608] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:51.880 NewBaseBdev 00:21:51.880 09:26:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:51.880 09:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:51.880 
09:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:51.880 09:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:51.880 09:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:51.880 09:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:51.880 09:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:52.138 09:26:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:52.138 [ 00:21:52.138 { 00:21:52.138 "name": "NewBaseBdev", 00:21:52.138 "aliases": [ 00:21:52.138 "27e54397-59d8-4264-a442-9b2790b132ba" 00:21:52.138 ], 00:21:52.138 "product_name": "Malloc disk", 00:21:52.138 "block_size": 512, 00:21:52.138 "num_blocks": 65536, 00:21:52.138 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:52.138 "assigned_rate_limits": { 00:21:52.138 "rw_ios_per_sec": 0, 00:21:52.138 "rw_mbytes_per_sec": 0, 00:21:52.138 "r_mbytes_per_sec": 0, 00:21:52.138 "w_mbytes_per_sec": 0 00:21:52.138 }, 00:21:52.138 "claimed": true, 00:21:52.138 "claim_type": "exclusive_write", 00:21:52.138 "zoned": false, 00:21:52.138 "supported_io_types": { 00:21:52.138 "read": true, 00:21:52.138 "write": true, 00:21:52.138 "unmap": true, 00:21:52.138 "flush": true, 00:21:52.138 "reset": true, 00:21:52.138 "nvme_admin": false, 00:21:52.138 "nvme_io": false, 00:21:52.138 "nvme_io_md": false, 00:21:52.138 "write_zeroes": true, 00:21:52.138 "zcopy": true, 00:21:52.138 "get_zone_info": false, 00:21:52.138 "zone_management": false, 00:21:52.138 "zone_append": false, 00:21:52.138 "compare": false, 00:21:52.138 "compare_and_write": false, 00:21:52.138 "abort": true, 00:21:52.138 "seek_hole": false, 00:21:52.138 "seek_data": false, 00:21:52.138 "copy": true, 00:21:52.138 "nvme_iov_md": false 00:21:52.138 }, 00:21:52.138 "memory_domains": [ 00:21:52.138 { 00:21:52.138 "dma_device_id": "system", 00:21:52.138 "dma_device_type": 1 00:21:52.138 }, 00:21:52.138 { 00:21:52.138 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.138 "dma_device_type": 2 00:21:52.138 } 00:21:52.138 ], 00:21:52.138 "driver_specific": {} 00:21:52.138 } 00:21:52.138 ] 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.397 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:52.656 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.656 "name": "Existed_Raid", 00:21:52.656 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:52.656 "strip_size_kb": 0, 00:21:52.656 "state": "online", 00:21:52.656 "raid_level": "raid1", 00:21:52.656 "superblock": true, 00:21:52.656 "num_base_bdevs": 4, 00:21:52.656 "num_base_bdevs_discovered": 4, 00:21:52.656 "num_base_bdevs_operational": 4, 00:21:52.656 "base_bdevs_list": [ 00:21:52.656 { 00:21:52.656 "name": "NewBaseBdev", 00:21:52.656 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:52.656 "is_configured": true, 00:21:52.656 "data_offset": 2048, 00:21:52.656 "data_size": 63488 00:21:52.656 }, 00:21:52.656 { 00:21:52.656 "name": "BaseBdev2", 00:21:52.656 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:52.656 "is_configured": true, 00:21:52.656 "data_offset": 2048, 00:21:52.656 "data_size": 63488 00:21:52.656 }, 00:21:52.656 { 00:21:52.656 "name": "BaseBdev3", 00:21:52.656 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:52.656 "is_configured": true, 00:21:52.656 "data_offset": 2048, 00:21:52.656 "data_size": 63488 00:21:52.656 }, 00:21:52.656 { 00:21:52.656 "name": "BaseBdev4", 00:21:52.656 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:52.656 "is_configured": true, 00:21:52.656 "data_offset": 2048, 00:21:52.656 "data_size": 63488 00:21:52.656 } 00:21:52.656 ] 00:21:52.656 }' 00:21:52.656 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.656 09:26:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:53.221 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:53.221 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:53.221 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:53.221 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:53.221 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:53.221 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:53.221 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:53.221 09:26:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:53.221 [2024-07-15 09:26:02.149378] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:53.479 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:53.479 "name": "Existed_Raid", 00:21:53.479 "aliases": [ 
00:21:53.479 "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4" 00:21:53.479 ], 00:21:53.479 "product_name": "Raid Volume", 00:21:53.479 "block_size": 512, 00:21:53.479 "num_blocks": 63488, 00:21:53.479 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:53.479 "assigned_rate_limits": { 00:21:53.479 "rw_ios_per_sec": 0, 00:21:53.479 "rw_mbytes_per_sec": 0, 00:21:53.479 "r_mbytes_per_sec": 0, 00:21:53.479 "w_mbytes_per_sec": 0 00:21:53.479 }, 00:21:53.479 "claimed": false, 00:21:53.479 "zoned": false, 00:21:53.479 "supported_io_types": { 00:21:53.479 "read": true, 00:21:53.479 "write": true, 00:21:53.479 "unmap": false, 00:21:53.479 "flush": false, 00:21:53.479 "reset": true, 00:21:53.479 "nvme_admin": false, 00:21:53.479 "nvme_io": false, 00:21:53.479 "nvme_io_md": false, 00:21:53.479 "write_zeroes": true, 00:21:53.479 "zcopy": false, 00:21:53.479 "get_zone_info": false, 00:21:53.479 "zone_management": false, 00:21:53.479 "zone_append": false, 00:21:53.479 "compare": false, 00:21:53.479 "compare_and_write": false, 00:21:53.479 "abort": false, 00:21:53.479 "seek_hole": false, 00:21:53.479 "seek_data": false, 00:21:53.479 "copy": false, 00:21:53.479 "nvme_iov_md": false 00:21:53.479 }, 00:21:53.479 "memory_domains": [ 00:21:53.479 { 00:21:53.479 "dma_device_id": "system", 00:21:53.479 "dma_device_type": 1 00:21:53.479 }, 00:21:53.479 { 00:21:53.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.479 "dma_device_type": 2 00:21:53.479 }, 00:21:53.479 { 00:21:53.479 "dma_device_id": "system", 00:21:53.479 "dma_device_type": 1 00:21:53.479 }, 00:21:53.479 { 00:21:53.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.479 "dma_device_type": 2 00:21:53.479 }, 00:21:53.479 { 00:21:53.479 "dma_device_id": "system", 00:21:53.479 "dma_device_type": 1 00:21:53.479 }, 00:21:53.479 { 00:21:53.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.479 "dma_device_type": 2 00:21:53.479 }, 00:21:53.479 { 00:21:53.479 "dma_device_id": "system", 00:21:53.479 "dma_device_type": 1 00:21:53.479 }, 00:21:53.479 { 00:21:53.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.479 "dma_device_type": 2 00:21:53.479 } 00:21:53.479 ], 00:21:53.479 "driver_specific": { 00:21:53.479 "raid": { 00:21:53.479 "uuid": "49a5dbfe-ff16-4241-91f3-9d45a1afe3e4", 00:21:53.479 "strip_size_kb": 0, 00:21:53.479 "state": "online", 00:21:53.479 "raid_level": "raid1", 00:21:53.479 "superblock": true, 00:21:53.479 "num_base_bdevs": 4, 00:21:53.479 "num_base_bdevs_discovered": 4, 00:21:53.479 "num_base_bdevs_operational": 4, 00:21:53.479 "base_bdevs_list": [ 00:21:53.479 { 00:21:53.480 "name": "NewBaseBdev", 00:21:53.480 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:53.480 "is_configured": true, 00:21:53.480 "data_offset": 2048, 00:21:53.480 "data_size": 63488 00:21:53.480 }, 00:21:53.480 { 00:21:53.480 "name": "BaseBdev2", 00:21:53.480 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:53.480 "is_configured": true, 00:21:53.480 "data_offset": 2048, 00:21:53.480 "data_size": 63488 00:21:53.480 }, 00:21:53.480 { 00:21:53.480 "name": "BaseBdev3", 00:21:53.480 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:53.480 "is_configured": true, 00:21:53.480 "data_offset": 2048, 00:21:53.480 "data_size": 63488 00:21:53.480 }, 00:21:53.480 { 00:21:53.480 "name": "BaseBdev4", 00:21:53.480 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:53.480 "is_configured": true, 00:21:53.480 "data_offset": 2048, 00:21:53.480 "data_size": 63488 00:21:53.480 } 00:21:53.480 ] 00:21:53.480 } 00:21:53.480 } 00:21:53.480 }' 00:21:53.480 09:26:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:53.480 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:53.480 BaseBdev2 00:21:53.480 BaseBdev3 00:21:53.480 BaseBdev4' 00:21:53.480 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.480 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:53.480 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:53.738 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:53.738 "name": "NewBaseBdev", 00:21:53.738 "aliases": [ 00:21:53.738 "27e54397-59d8-4264-a442-9b2790b132ba" 00:21:53.738 ], 00:21:53.738 "product_name": "Malloc disk", 00:21:53.738 "block_size": 512, 00:21:53.738 "num_blocks": 65536, 00:21:53.738 "uuid": "27e54397-59d8-4264-a442-9b2790b132ba", 00:21:53.738 "assigned_rate_limits": { 00:21:53.738 "rw_ios_per_sec": 0, 00:21:53.738 "rw_mbytes_per_sec": 0, 00:21:53.738 "r_mbytes_per_sec": 0, 00:21:53.738 "w_mbytes_per_sec": 0 00:21:53.738 }, 00:21:53.738 "claimed": true, 00:21:53.738 "claim_type": "exclusive_write", 00:21:53.738 "zoned": false, 00:21:53.738 "supported_io_types": { 00:21:53.738 "read": true, 00:21:53.738 "write": true, 00:21:53.738 "unmap": true, 00:21:53.738 "flush": true, 00:21:53.738 "reset": true, 00:21:53.738 "nvme_admin": false, 00:21:53.738 "nvme_io": false, 00:21:53.738 "nvme_io_md": false, 00:21:53.738 "write_zeroes": true, 00:21:53.738 "zcopy": true, 00:21:53.738 "get_zone_info": false, 00:21:53.738 "zone_management": false, 00:21:53.738 "zone_append": false, 00:21:53.738 "compare": false, 00:21:53.738 "compare_and_write": false, 00:21:53.738 "abort": true, 00:21:53.738 "seek_hole": false, 00:21:53.738 "seek_data": false, 00:21:53.738 "copy": true, 00:21:53.738 "nvme_iov_md": false 00:21:53.738 }, 00:21:53.738 "memory_domains": [ 00:21:53.738 { 00:21:53.738 "dma_device_id": "system", 00:21:53.738 "dma_device_type": 1 00:21:53.738 }, 00:21:53.738 { 00:21:53.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.738 "dma_device_type": 2 00:21:53.738 } 00:21:53.738 ], 00:21:53.738 "driver_specific": {} 00:21:53.738 }' 00:21:53.738 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.738 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.738 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:53.738 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.738 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.738 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:53.738 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.738 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.996 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:53.996 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:53.996 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.996 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:53.996 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.996 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:53.996 09:26:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.253 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.253 "name": "BaseBdev2", 00:21:54.253 "aliases": [ 00:21:54.253 "4621af3e-ea64-4a16-bdc5-f8ae3185654b" 00:21:54.253 ], 00:21:54.253 "product_name": "Malloc disk", 00:21:54.253 "block_size": 512, 00:21:54.253 "num_blocks": 65536, 00:21:54.253 "uuid": "4621af3e-ea64-4a16-bdc5-f8ae3185654b", 00:21:54.253 "assigned_rate_limits": { 00:21:54.253 "rw_ios_per_sec": 0, 00:21:54.253 "rw_mbytes_per_sec": 0, 00:21:54.253 "r_mbytes_per_sec": 0, 00:21:54.253 "w_mbytes_per_sec": 0 00:21:54.253 }, 00:21:54.253 "claimed": true, 00:21:54.253 "claim_type": "exclusive_write", 00:21:54.253 "zoned": false, 00:21:54.253 "supported_io_types": { 00:21:54.253 "read": true, 00:21:54.253 "write": true, 00:21:54.253 "unmap": true, 00:21:54.253 "flush": true, 00:21:54.253 "reset": true, 00:21:54.253 "nvme_admin": false, 00:21:54.253 "nvme_io": false, 00:21:54.253 "nvme_io_md": false, 00:21:54.253 "write_zeroes": true, 00:21:54.253 "zcopy": true, 00:21:54.253 "get_zone_info": false, 00:21:54.253 "zone_management": false, 00:21:54.253 "zone_append": false, 00:21:54.253 "compare": false, 00:21:54.253 "compare_and_write": false, 00:21:54.253 "abort": true, 00:21:54.253 "seek_hole": false, 00:21:54.253 "seek_data": false, 00:21:54.253 "copy": true, 00:21:54.253 "nvme_iov_md": false 00:21:54.253 }, 00:21:54.253 "memory_domains": [ 00:21:54.253 { 00:21:54.253 "dma_device_id": "system", 00:21:54.253 "dma_device_type": 1 00:21:54.253 }, 00:21:54.253 { 00:21:54.253 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.253 "dma_device_type": 2 00:21:54.253 } 00:21:54.253 ], 00:21:54.253 "driver_specific": {} 00:21:54.253 }' 00:21:54.253 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.253 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.253 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:54.253 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.253 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.510 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.510 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.510 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.510 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.511 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.511 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.511 
09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.511 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.511 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:54.511 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.768 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.768 "name": "BaseBdev3", 00:21:54.768 "aliases": [ 00:21:54.768 "a56bfba1-f98e-457c-91bb-71c2a4ac6246" 00:21:54.768 ], 00:21:54.768 "product_name": "Malloc disk", 00:21:54.768 "block_size": 512, 00:21:54.768 "num_blocks": 65536, 00:21:54.768 "uuid": "a56bfba1-f98e-457c-91bb-71c2a4ac6246", 00:21:54.768 "assigned_rate_limits": { 00:21:54.768 "rw_ios_per_sec": 0, 00:21:54.768 "rw_mbytes_per_sec": 0, 00:21:54.768 "r_mbytes_per_sec": 0, 00:21:54.768 "w_mbytes_per_sec": 0 00:21:54.768 }, 00:21:54.768 "claimed": true, 00:21:54.768 "claim_type": "exclusive_write", 00:21:54.768 "zoned": false, 00:21:54.768 "supported_io_types": { 00:21:54.768 "read": true, 00:21:54.768 "write": true, 00:21:54.768 "unmap": true, 00:21:54.768 "flush": true, 00:21:54.768 "reset": true, 00:21:54.768 "nvme_admin": false, 00:21:54.768 "nvme_io": false, 00:21:54.768 "nvme_io_md": false, 00:21:54.768 "write_zeroes": true, 00:21:54.768 "zcopy": true, 00:21:54.768 "get_zone_info": false, 00:21:54.768 "zone_management": false, 00:21:54.768 "zone_append": false, 00:21:54.768 "compare": false, 00:21:54.768 "compare_and_write": false, 00:21:54.768 "abort": true, 00:21:54.768 "seek_hole": false, 00:21:54.768 "seek_data": false, 00:21:54.768 "copy": true, 00:21:54.768 "nvme_iov_md": false 00:21:54.768 }, 00:21:54.768 "memory_domains": [ 00:21:54.768 { 00:21:54.768 "dma_device_id": "system", 00:21:54.768 "dma_device_type": 1 00:21:54.768 }, 00:21:54.768 { 00:21:54.768 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.768 "dma_device_type": 2 00:21:54.768 } 00:21:54.768 ], 00:21:54.768 "driver_specific": {} 00:21:54.768 }' 00:21:54.768 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.768 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.026 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:55.026 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.026 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.026 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:55.026 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.026 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.026 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:55.026 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.027 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.285 09:26:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:55.285 09:26:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:55.285 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:55.285 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:55.544 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:55.544 "name": "BaseBdev4", 00:21:55.544 "aliases": [ 00:21:55.544 "057bc3e3-f719-42c9-80a3-e6f215e56d2f" 00:21:55.544 ], 00:21:55.544 "product_name": "Malloc disk", 00:21:55.544 "block_size": 512, 00:21:55.544 "num_blocks": 65536, 00:21:55.544 "uuid": "057bc3e3-f719-42c9-80a3-e6f215e56d2f", 00:21:55.544 "assigned_rate_limits": { 00:21:55.544 "rw_ios_per_sec": 0, 00:21:55.544 "rw_mbytes_per_sec": 0, 00:21:55.544 "r_mbytes_per_sec": 0, 00:21:55.544 "w_mbytes_per_sec": 0 00:21:55.544 }, 00:21:55.544 "claimed": true, 00:21:55.544 "claim_type": "exclusive_write", 00:21:55.544 "zoned": false, 00:21:55.544 "supported_io_types": { 00:21:55.544 "read": true, 00:21:55.544 "write": true, 00:21:55.544 "unmap": true, 00:21:55.544 "flush": true, 00:21:55.544 "reset": true, 00:21:55.544 "nvme_admin": false, 00:21:55.544 "nvme_io": false, 00:21:55.544 "nvme_io_md": false, 00:21:55.544 "write_zeroes": true, 00:21:55.544 "zcopy": true, 00:21:55.544 "get_zone_info": false, 00:21:55.544 "zone_management": false, 00:21:55.544 "zone_append": false, 00:21:55.544 "compare": false, 00:21:55.544 "compare_and_write": false, 00:21:55.544 "abort": true, 00:21:55.544 "seek_hole": false, 00:21:55.544 "seek_data": false, 00:21:55.544 "copy": true, 00:21:55.544 "nvme_iov_md": false 00:21:55.544 }, 00:21:55.544 "memory_domains": [ 00:21:55.544 { 00:21:55.544 "dma_device_id": "system", 00:21:55.544 "dma_device_type": 1 00:21:55.544 }, 00:21:55.544 { 00:21:55.544 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.544 "dma_device_type": 2 00:21:55.544 } 00:21:55.544 ], 00:21:55.544 "driver_specific": {} 00:21:55.544 }' 00:21:55.544 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.544 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.544 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:55.544 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.544 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.544 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:55.544 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.544 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.803 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:55.803 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.803 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.803 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:55.803 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:56.061 [2024-07-15 09:26:04.832194] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:56.061 [2024-07-15 09:26:04.832222] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:56.061 [2024-07-15 09:26:04.832278] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:56.061 [2024-07-15 09:26:04.832581] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:56.061 [2024-07-15 09:26:04.832594] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1312180 name Existed_Raid, state offline 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 179707 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 179707 ']' 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 179707 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 179707 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 179707' 00:21:56.061 killing process with pid 179707 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 179707 00:21:56.061 [2024-07-15 09:26:04.902141] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:56.061 09:26:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 179707 00:21:56.061 [2024-07-15 09:26:04.969813] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:56.651 09:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:56.651 00:21:56.651 real 0m32.778s 00:21:56.651 user 0m59.961s 00:21:56.651 sys 0m5.871s 00:21:56.651 09:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:56.651 09:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:56.651 ************************************ 00:21:56.651 END TEST raid_state_function_test_sb 00:21:56.651 ************************************ 00:21:56.651 09:26:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:56.651 09:26:05 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:21:56.651 09:26:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:56.651 09:26:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:56.651 09:26:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:56.651 ************************************ 00:21:56.651 START TEST raid_superblock_test 00:21:56.651 ************************************ 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 
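For reference, the remove/re-add cycle traced above by raid_state_function_test_sb reduces to the RPC sequence sketched below. This is an illustrative condensation assembled from the trace, not part of the captured output; it assumes the bdev_svc target is still listening on /var/tmp/spdk-raid.sock and that the four-disk raid1 volume Existed_Raid is already configured, as in the log.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Detach one base bdev; the raid bdev stays in (or falls back to) the
# "configuring" state and that slot reports is_configured == false.
$rpc bdev_raid_remove_base_bdev BaseBdev3
$rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect: false

# Re-attach the same bdev and confirm the slot is configured again.
$rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev3
$rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect: true

# Tear the volume down once the checks pass, as the test does just above.
$rpc bdev_raid_delete Existed_Raid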
00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=184586 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 184586 /var/tmp/spdk-raid.sock 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 184586 ']' 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:56.651 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:56.651 09:26:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.651 [2024-07-15 09:26:05.445415] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
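As a rough sketch of the waitforlisten step being traced here: the target bring-up amounts to launching bdev_svc against a private RPC socket and polling that socket until the RPC server answers. The autotest helper adds timeouts, pid bookkeeping, and cleanup that this illustration omits; rpc_get_methods is used here purely as a liveness probe.

sock=/var/tmp/spdk-raid.sock
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r "$sock" -L bdev_raid &
raid_pid=$!

# Poll the UNIX-domain RPC socket until the app responds.
until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1; do
    sleep 0.1
done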
00:21:56.651 [2024-07-15 09:26:05.445480] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid184586 ] 00:21:56.651 [2024-07-15 09:26:05.573494] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:56.909 [2024-07-15 09:26:05.671931] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:56.909 [2024-07-15 09:26:05.737053] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:56.909 [2024-07-15 09:26:05.737090] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:57.475 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:57.733 malloc1 00:21:57.733 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:57.991 [2024-07-15 09:26:06.782966] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:57.991 [2024-07-15 09:26:06.783015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:57.991 [2024-07-15 09:26:06.783039] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x180c570 00:21:57.991 [2024-07-15 09:26:06.783052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:57.991 [2024-07-15 09:26:06.784663] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:57.991 [2024-07-15 09:26:06.784693] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:57.991 pt1 00:21:57.991 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:57.991 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:57.991 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:57.991 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:57.991 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:57.991 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:57.991 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:57.991 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:57.991 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:58.249 malloc2 00:21:58.249 09:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:58.249 [2024-07-15 09:26:07.120543] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:58.249 [2024-07-15 09:26:07.120589] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.249 [2024-07-15 09:26:07.120605] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x180d970 00:21:58.249 [2024-07-15 09:26:07.120617] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.249 [2024-07-15 09:26:07.122050] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.249 [2024-07-15 09:26:07.122078] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:58.249 pt2 00:21:58.249 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:58.249 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:58.249 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:58.249 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:58.249 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:58.249 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:58.249 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:58.249 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:58.249 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:58.505 malloc3 00:21:58.505 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:58.761 [2024-07-15 09:26:07.461961] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:58.761 [2024-07-15 09:26:07.462007] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.761 [2024-07-15 09:26:07.462025] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a4340 00:21:58.761 [2024-07-15 09:26:07.462038] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.761 [2024-07-15 09:26:07.463487] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.761 [2024-07-15 09:26:07.463520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:58.761 pt3 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:58.761 malloc4 00:21:58.761 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:59.016 [2024-07-15 09:26:07.819542] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:59.016 [2024-07-15 09:26:07.819586] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:59.016 [2024-07-15 09:26:07.819605] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a6c60 00:21:59.016 [2024-07-15 09:26:07.819618] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:59.016 [2024-07-15 09:26:07.821014] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:59.016 [2024-07-15 09:26:07.821041] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:59.016 pt4 00:21:59.016 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:59.016 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:59.016 09:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:59.273 [2024-07-15 09:26:07.992022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:59.273 [2024-07-15 09:26:07.993182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:59.273 [2024-07-15 09:26:07.993235] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:59.273 [2024-07-15 09:26:07.993278] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:59.273 [2024-07-15 09:26:07.993442] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1804530 00:21:59.273 [2024-07-15 09:26:07.993453] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:59.273 [2024-07-15 09:26:07.993634] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x1802770 00:21:59.273 [2024-07-15 09:26:07.993779] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1804530 00:21:59.273 [2024-07-15 09:26:07.993789] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1804530 00:21:59.273 [2024-07-15 09:26:07.993878] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.273 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.530 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.530 "name": "raid_bdev1", 00:21:59.530 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:21:59.530 "strip_size_kb": 0, 00:21:59.530 "state": "online", 00:21:59.530 "raid_level": "raid1", 00:21:59.530 "superblock": true, 00:21:59.530 "num_base_bdevs": 4, 00:21:59.530 "num_base_bdevs_discovered": 4, 00:21:59.530 "num_base_bdevs_operational": 4, 00:21:59.530 "base_bdevs_list": [ 00:21:59.530 { 00:21:59.530 "name": "pt1", 00:21:59.530 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:59.530 "is_configured": true, 00:21:59.530 "data_offset": 2048, 00:21:59.530 "data_size": 63488 00:21:59.530 }, 00:21:59.530 { 00:21:59.530 "name": "pt2", 00:21:59.530 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:59.530 "is_configured": true, 00:21:59.530 "data_offset": 2048, 00:21:59.530 "data_size": 63488 00:21:59.530 }, 00:21:59.530 { 00:21:59.530 "name": "pt3", 00:21:59.530 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:59.530 "is_configured": true, 00:21:59.530 "data_offset": 2048, 00:21:59.530 "data_size": 63488 00:21:59.530 }, 00:21:59.530 { 00:21:59.530 "name": "pt4", 00:21:59.530 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:59.530 "is_configured": true, 00:21:59.530 "data_offset": 2048, 00:21:59.530 "data_size": 63488 00:21:59.530 } 00:21:59.530 ] 00:21:59.530 }' 00:21:59.530 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.530 09:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:00.095 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:22:00.095 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:00.095 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:00.095 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:00.095 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:00.095 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:00.095 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:00.095 09:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:00.353 [2024-07-15 09:26:09.051139] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:00.353 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:00.353 "name": "raid_bdev1", 00:22:00.353 "aliases": [ 00:22:00.353 "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2" 00:22:00.353 ], 00:22:00.353 "product_name": "Raid Volume", 00:22:00.353 "block_size": 512, 00:22:00.353 "num_blocks": 63488, 00:22:00.353 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:00.353 "assigned_rate_limits": { 00:22:00.353 "rw_ios_per_sec": 0, 00:22:00.353 "rw_mbytes_per_sec": 0, 00:22:00.353 "r_mbytes_per_sec": 0, 00:22:00.353 "w_mbytes_per_sec": 0 00:22:00.353 }, 00:22:00.353 "claimed": false, 00:22:00.353 "zoned": false, 00:22:00.353 "supported_io_types": { 00:22:00.353 "read": true, 00:22:00.353 "write": true, 00:22:00.353 "unmap": false, 00:22:00.353 "flush": false, 00:22:00.353 "reset": true, 00:22:00.353 "nvme_admin": false, 00:22:00.353 "nvme_io": false, 00:22:00.353 "nvme_io_md": false, 00:22:00.353 "write_zeroes": true, 00:22:00.353 "zcopy": false, 00:22:00.353 "get_zone_info": false, 00:22:00.353 "zone_management": false, 00:22:00.353 "zone_append": false, 00:22:00.353 "compare": false, 00:22:00.353 "compare_and_write": false, 00:22:00.353 "abort": false, 00:22:00.353 "seek_hole": false, 00:22:00.353 "seek_data": false, 00:22:00.353 "copy": false, 00:22:00.353 "nvme_iov_md": false 00:22:00.353 }, 00:22:00.353 "memory_domains": [ 00:22:00.353 { 00:22:00.353 "dma_device_id": "system", 00:22:00.353 "dma_device_type": 1 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.353 "dma_device_type": 2 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "dma_device_id": "system", 00:22:00.353 "dma_device_type": 1 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.353 "dma_device_type": 2 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "dma_device_id": "system", 00:22:00.353 "dma_device_type": 1 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.353 "dma_device_type": 2 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "dma_device_id": "system", 00:22:00.353 "dma_device_type": 1 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.353 "dma_device_type": 2 00:22:00.353 } 00:22:00.353 ], 00:22:00.353 "driver_specific": { 00:22:00.353 "raid": { 00:22:00.353 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:00.353 "strip_size_kb": 0, 00:22:00.353 "state": "online", 00:22:00.353 "raid_level": "raid1", 00:22:00.353 "superblock": true, 00:22:00.353 
"num_base_bdevs": 4, 00:22:00.353 "num_base_bdevs_discovered": 4, 00:22:00.353 "num_base_bdevs_operational": 4, 00:22:00.353 "base_bdevs_list": [ 00:22:00.353 { 00:22:00.353 "name": "pt1", 00:22:00.353 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:00.353 "is_configured": true, 00:22:00.353 "data_offset": 2048, 00:22:00.353 "data_size": 63488 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "name": "pt2", 00:22:00.353 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:00.353 "is_configured": true, 00:22:00.353 "data_offset": 2048, 00:22:00.353 "data_size": 63488 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "name": "pt3", 00:22:00.353 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:00.353 "is_configured": true, 00:22:00.353 "data_offset": 2048, 00:22:00.353 "data_size": 63488 00:22:00.353 }, 00:22:00.353 { 00:22:00.353 "name": "pt4", 00:22:00.353 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:00.353 "is_configured": true, 00:22:00.353 "data_offset": 2048, 00:22:00.354 "data_size": 63488 00:22:00.354 } 00:22:00.354 ] 00:22:00.354 } 00:22:00.354 } 00:22:00.354 }' 00:22:00.354 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:00.354 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:00.354 pt2 00:22:00.354 pt3 00:22:00.354 pt4' 00:22:00.354 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:00.354 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:00.354 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:00.612 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:00.612 "name": "pt1", 00:22:00.612 "aliases": [ 00:22:00.612 "00000000-0000-0000-0000-000000000001" 00:22:00.612 ], 00:22:00.612 "product_name": "passthru", 00:22:00.612 "block_size": 512, 00:22:00.612 "num_blocks": 65536, 00:22:00.612 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:00.612 "assigned_rate_limits": { 00:22:00.612 "rw_ios_per_sec": 0, 00:22:00.612 "rw_mbytes_per_sec": 0, 00:22:00.612 "r_mbytes_per_sec": 0, 00:22:00.612 "w_mbytes_per_sec": 0 00:22:00.612 }, 00:22:00.612 "claimed": true, 00:22:00.612 "claim_type": "exclusive_write", 00:22:00.612 "zoned": false, 00:22:00.612 "supported_io_types": { 00:22:00.612 "read": true, 00:22:00.612 "write": true, 00:22:00.612 "unmap": true, 00:22:00.612 "flush": true, 00:22:00.612 "reset": true, 00:22:00.612 "nvme_admin": false, 00:22:00.612 "nvme_io": false, 00:22:00.612 "nvme_io_md": false, 00:22:00.612 "write_zeroes": true, 00:22:00.612 "zcopy": true, 00:22:00.612 "get_zone_info": false, 00:22:00.612 "zone_management": false, 00:22:00.612 "zone_append": false, 00:22:00.612 "compare": false, 00:22:00.612 "compare_and_write": false, 00:22:00.612 "abort": true, 00:22:00.612 "seek_hole": false, 00:22:00.612 "seek_data": false, 00:22:00.612 "copy": true, 00:22:00.612 "nvme_iov_md": false 00:22:00.612 }, 00:22:00.612 "memory_domains": [ 00:22:00.612 { 00:22:00.612 "dma_device_id": "system", 00:22:00.612 "dma_device_type": 1 00:22:00.612 }, 00:22:00.612 { 00:22:00.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.612 "dma_device_type": 2 00:22:00.612 } 00:22:00.612 ], 00:22:00.612 "driver_specific": { 00:22:00.612 "passthru": { 00:22:00.612 
"name": "pt1", 00:22:00.612 "base_bdev_name": "malloc1" 00:22:00.612 } 00:22:00.612 } 00:22:00.612 }' 00:22:00.612 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:00.612 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:00.612 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:00.612 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:00.612 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:00.612 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:00.612 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:00.871 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:00.871 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:00.871 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:00.871 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:00.871 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:00.871 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:00.871 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:00.871 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:01.130 09:26:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:01.130 "name": "pt2", 00:22:01.130 "aliases": [ 00:22:01.130 "00000000-0000-0000-0000-000000000002" 00:22:01.130 ], 00:22:01.130 "product_name": "passthru", 00:22:01.130 "block_size": 512, 00:22:01.130 "num_blocks": 65536, 00:22:01.130 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:01.130 "assigned_rate_limits": { 00:22:01.130 "rw_ios_per_sec": 0, 00:22:01.130 "rw_mbytes_per_sec": 0, 00:22:01.130 "r_mbytes_per_sec": 0, 00:22:01.130 "w_mbytes_per_sec": 0 00:22:01.130 }, 00:22:01.130 "claimed": true, 00:22:01.130 "claim_type": "exclusive_write", 00:22:01.130 "zoned": false, 00:22:01.130 "supported_io_types": { 00:22:01.130 "read": true, 00:22:01.130 "write": true, 00:22:01.130 "unmap": true, 00:22:01.130 "flush": true, 00:22:01.130 "reset": true, 00:22:01.130 "nvme_admin": false, 00:22:01.130 "nvme_io": false, 00:22:01.130 "nvme_io_md": false, 00:22:01.130 "write_zeroes": true, 00:22:01.130 "zcopy": true, 00:22:01.130 "get_zone_info": false, 00:22:01.130 "zone_management": false, 00:22:01.130 "zone_append": false, 00:22:01.130 "compare": false, 00:22:01.130 "compare_and_write": false, 00:22:01.130 "abort": true, 00:22:01.130 "seek_hole": false, 00:22:01.130 "seek_data": false, 00:22:01.130 "copy": true, 00:22:01.130 "nvme_iov_md": false 00:22:01.130 }, 00:22:01.130 "memory_domains": [ 00:22:01.130 { 00:22:01.130 "dma_device_id": "system", 00:22:01.130 "dma_device_type": 1 00:22:01.130 }, 00:22:01.130 { 00:22:01.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.130 "dma_device_type": 2 00:22:01.130 } 00:22:01.130 ], 00:22:01.130 "driver_specific": { 00:22:01.130 "passthru": { 00:22:01.130 "name": "pt2", 00:22:01.130 "base_bdev_name": "malloc2" 00:22:01.130 } 00:22:01.130 } 00:22:01.130 }' 00:22:01.130 09:26:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:01.130 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:01.130 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:01.130 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:01.388 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:01.388 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:01.388 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:01.388 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:01.388 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:01.388 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:01.388 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:01.646 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:01.646 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:01.646 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:01.646 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:01.646 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:01.646 "name": "pt3", 00:22:01.646 "aliases": [ 00:22:01.646 "00000000-0000-0000-0000-000000000003" 00:22:01.646 ], 00:22:01.646 "product_name": "passthru", 00:22:01.646 "block_size": 512, 00:22:01.646 "num_blocks": 65536, 00:22:01.646 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:01.646 "assigned_rate_limits": { 00:22:01.646 "rw_ios_per_sec": 0, 00:22:01.646 "rw_mbytes_per_sec": 0, 00:22:01.646 "r_mbytes_per_sec": 0, 00:22:01.646 "w_mbytes_per_sec": 0 00:22:01.646 }, 00:22:01.646 "claimed": true, 00:22:01.646 "claim_type": "exclusive_write", 00:22:01.646 "zoned": false, 00:22:01.646 "supported_io_types": { 00:22:01.646 "read": true, 00:22:01.646 "write": true, 00:22:01.646 "unmap": true, 00:22:01.646 "flush": true, 00:22:01.646 "reset": true, 00:22:01.646 "nvme_admin": false, 00:22:01.646 "nvme_io": false, 00:22:01.646 "nvme_io_md": false, 00:22:01.646 "write_zeroes": true, 00:22:01.646 "zcopy": true, 00:22:01.646 "get_zone_info": false, 00:22:01.646 "zone_management": false, 00:22:01.646 "zone_append": false, 00:22:01.646 "compare": false, 00:22:01.646 "compare_and_write": false, 00:22:01.646 "abort": true, 00:22:01.646 "seek_hole": false, 00:22:01.646 "seek_data": false, 00:22:01.646 "copy": true, 00:22:01.646 "nvme_iov_md": false 00:22:01.646 }, 00:22:01.646 "memory_domains": [ 00:22:01.646 { 00:22:01.646 "dma_device_id": "system", 00:22:01.646 "dma_device_type": 1 00:22:01.646 }, 00:22:01.646 { 00:22:01.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.646 "dma_device_type": 2 00:22:01.646 } 00:22:01.646 ], 00:22:01.646 "driver_specific": { 00:22:01.646 "passthru": { 00:22:01.646 "name": "pt3", 00:22:01.646 "base_bdev_name": "malloc3" 00:22:01.646 } 00:22:01.646 } 00:22:01.646 }' 00:22:01.646 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:01.903 09:26:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:01.903 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:01.903 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:01.903 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:01.903 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:01.903 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:01.903 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:01.903 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:01.903 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.180 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.180 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:02.180 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:02.180 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:02.180 09:26:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:02.475 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:02.475 "name": "pt4", 00:22:02.475 "aliases": [ 00:22:02.475 "00000000-0000-0000-0000-000000000004" 00:22:02.475 ], 00:22:02.475 "product_name": "passthru", 00:22:02.475 "block_size": 512, 00:22:02.475 "num_blocks": 65536, 00:22:02.475 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:02.475 "assigned_rate_limits": { 00:22:02.475 "rw_ios_per_sec": 0, 00:22:02.475 "rw_mbytes_per_sec": 0, 00:22:02.475 "r_mbytes_per_sec": 0, 00:22:02.475 "w_mbytes_per_sec": 0 00:22:02.475 }, 00:22:02.475 "claimed": true, 00:22:02.475 "claim_type": "exclusive_write", 00:22:02.475 "zoned": false, 00:22:02.475 "supported_io_types": { 00:22:02.475 "read": true, 00:22:02.475 "write": true, 00:22:02.475 "unmap": true, 00:22:02.475 "flush": true, 00:22:02.475 "reset": true, 00:22:02.475 "nvme_admin": false, 00:22:02.475 "nvme_io": false, 00:22:02.475 "nvme_io_md": false, 00:22:02.475 "write_zeroes": true, 00:22:02.475 "zcopy": true, 00:22:02.475 "get_zone_info": false, 00:22:02.475 "zone_management": false, 00:22:02.475 "zone_append": false, 00:22:02.475 "compare": false, 00:22:02.475 "compare_and_write": false, 00:22:02.475 "abort": true, 00:22:02.475 "seek_hole": false, 00:22:02.475 "seek_data": false, 00:22:02.475 "copy": true, 00:22:02.475 "nvme_iov_md": false 00:22:02.475 }, 00:22:02.475 "memory_domains": [ 00:22:02.475 { 00:22:02.475 "dma_device_id": "system", 00:22:02.475 "dma_device_type": 1 00:22:02.475 }, 00:22:02.475 { 00:22:02.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.475 "dma_device_type": 2 00:22:02.475 } 00:22:02.475 ], 00:22:02.475 "driver_specific": { 00:22:02.475 "passthru": { 00:22:02.475 "name": "pt4", 00:22:02.475 "base_bdev_name": "malloc4" 00:22:02.475 } 00:22:02.475 } 00:22:02.475 }' 00:22:02.475 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.475 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.475 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 
512 ]] 00:22:02.475 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.475 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.475 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:02.475 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.737 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.737 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:02.737 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.737 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.737 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:02.737 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:02.737 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:02.737 [2024-07-15 09:26:11.686149] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:02.995 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=9b872258-5a11-4d2b-a8bd-8d0cb3b535d2 00:22:02.995 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9b872258-5a11-4d2b-a8bd-8d0cb3b535d2 ']' 00:22:02.995 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:02.995 [2024-07-15 09:26:11.918418] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:02.995 [2024-07-15 09:26:11.918439] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:02.995 [2024-07-15 09:26:11.918494] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:02.995 [2024-07-15 09:26:11.918581] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:02.995 [2024-07-15 09:26:11.918594] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1804530 name raid_bdev1, state offline 00:22:02.995 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.995 09:26:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:03.254 09:26:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:03.254 09:26:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:03.254 09:26:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:03.254 09:26:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:03.820 09:26:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:03.820 09:26:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:22:04.077 09:26:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:04.077 09:26:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:04.643 09:26:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:04.643 09:26:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:04.901 09:26:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:04.901 09:26:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:05.160 09:26:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:05.418 [2024-07-15 09:26:14.128144] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:05.418 [2024-07-15 09:26:14.129564] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:05.418 [2024-07-15 09:26:14.129609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:22:05.418 [2024-07-15 09:26:14.129643] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:22:05.418 [2024-07-15 09:26:14.129691] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:05.418 [2024-07-15 09:26:14.129732] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:05.418 [2024-07-15 09:26:14.129755] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:22:05.418 [2024-07-15 09:26:14.129778] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:22:05.418 [2024-07-15 09:26:14.129796] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:05.418 [2024-07-15 09:26:14.129807] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19afff0 name raid_bdev1, state configuring 00:22:05.418 request: 00:22:05.418 { 00:22:05.418 "name": "raid_bdev1", 00:22:05.418 "raid_level": "raid1", 00:22:05.418 "base_bdevs": [ 00:22:05.418 "malloc1", 00:22:05.418 "malloc2", 00:22:05.418 "malloc3", 00:22:05.418 "malloc4" 00:22:05.418 ], 00:22:05.418 "superblock": false, 00:22:05.418 "method": "bdev_raid_create", 00:22:05.418 "req_id": 1 00:22:05.418 } 00:22:05.418 Got JSON-RPC error response 00:22:05.418 response: 00:22:05.418 { 00:22:05.418 "code": -17, 00:22:05.418 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:05.418 } 00:22:05.418 09:26:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:22:05.418 09:26:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:05.418 09:26:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:05.418 09:26:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:05.418 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.418 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:05.983 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:05.983 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:05.983 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:05.983 [2024-07-15 09:26:14.878038] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:05.983 [2024-07-15 09:26:14.878091] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:05.983 [2024-07-15 09:26:14.878114] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x180c7a0 00:22:05.983 [2024-07-15 09:26:14.878126] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:05.983 [2024-07-15 09:26:14.879740] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:05.983 [2024-07-15 09:26:14.879769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:05.983 [2024-07-15 09:26:14.879840] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:22:05.984 [2024-07-15 09:26:14.879867] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:05.984 pt1 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.984 09:26:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.550 09:26:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.550 "name": "raid_bdev1", 00:22:06.550 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:06.550 "strip_size_kb": 0, 00:22:06.550 "state": "configuring", 00:22:06.550 "raid_level": "raid1", 00:22:06.550 "superblock": true, 00:22:06.550 "num_base_bdevs": 4, 00:22:06.550 "num_base_bdevs_discovered": 1, 00:22:06.550 "num_base_bdevs_operational": 4, 00:22:06.550 "base_bdevs_list": [ 00:22:06.550 { 00:22:06.550 "name": "pt1", 00:22:06.550 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:06.550 "is_configured": true, 00:22:06.550 "data_offset": 2048, 00:22:06.550 "data_size": 63488 00:22:06.550 }, 00:22:06.550 { 00:22:06.550 "name": null, 00:22:06.550 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:06.550 "is_configured": false, 00:22:06.550 "data_offset": 2048, 00:22:06.550 "data_size": 63488 00:22:06.550 }, 00:22:06.550 { 00:22:06.550 "name": null, 00:22:06.550 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:06.550 "is_configured": false, 00:22:06.550 "data_offset": 2048, 00:22:06.550 "data_size": 63488 00:22:06.550 }, 00:22:06.550 { 00:22:06.550 "name": null, 00:22:06.550 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:06.550 "is_configured": false, 00:22:06.550 "data_offset": 2048, 00:22:06.550 "data_size": 63488 00:22:06.550 } 00:22:06.550 ] 00:22:06.550 }' 00:22:06.550 09:26:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.550 09:26:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.115 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:22:07.115 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:22:07.427 [2024-07-15 09:26:16.241653] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:07.427 [2024-07-15 09:26:16.241707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:07.427 [2024-07-15 09:26:16.241733] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a5940 00:22:07.427 [2024-07-15 09:26:16.241746] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:07.427 [2024-07-15 09:26:16.242110] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:07.427 [2024-07-15 09:26:16.242128] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:07.427 [2024-07-15 09:26:16.242191] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:07.427 [2024-07-15 09:26:16.242210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:07.427 pt2 00:22:07.427 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:07.684 [2024-07-15 09:26:16.490343] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.684 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.941 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.941 "name": "raid_bdev1", 00:22:07.941 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:07.941 "strip_size_kb": 0, 00:22:07.941 "state": "configuring", 00:22:07.941 "raid_level": "raid1", 00:22:07.941 "superblock": true, 00:22:07.941 "num_base_bdevs": 4, 00:22:07.941 "num_base_bdevs_discovered": 1, 00:22:07.941 "num_base_bdevs_operational": 4, 00:22:07.941 "base_bdevs_list": [ 00:22:07.941 { 00:22:07.941 "name": "pt1", 00:22:07.942 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:07.942 "is_configured": true, 00:22:07.942 "data_offset": 2048, 00:22:07.942 "data_size": 63488 00:22:07.942 }, 00:22:07.942 { 00:22:07.942 "name": null, 00:22:07.942 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:22:07.942 "is_configured": false, 00:22:07.942 "data_offset": 2048, 00:22:07.942 "data_size": 63488 00:22:07.942 }, 00:22:07.942 { 00:22:07.942 "name": null, 00:22:07.942 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:07.942 "is_configured": false, 00:22:07.942 "data_offset": 2048, 00:22:07.942 "data_size": 63488 00:22:07.942 }, 00:22:07.942 { 00:22:07.942 "name": null, 00:22:07.942 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:07.942 "is_configured": false, 00:22:07.942 "data_offset": 2048, 00:22:07.942 "data_size": 63488 00:22:07.942 } 00:22:07.942 ] 00:22:07.942 }' 00:22:07.942 09:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.942 09:26:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.507 09:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:08.507 09:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:08.507 09:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:08.765 [2024-07-15 09:26:17.585261] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:08.765 [2024-07-15 09:26:17.585314] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:08.765 [2024-07-15 09:26:17.585339] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1803060 00:22:08.765 [2024-07-15 09:26:17.585352] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:08.765 [2024-07-15 09:26:17.585701] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:08.765 [2024-07-15 09:26:17.585718] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:08.765 [2024-07-15 09:26:17.585782] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:08.765 [2024-07-15 09:26:17.585800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:08.765 pt2 00:22:08.765 09:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:08.765 09:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:08.765 09:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:09.331 [2024-07-15 09:26:18.082579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:09.331 [2024-07-15 09:26:18.082626] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.331 [2024-07-15 09:26:18.082647] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18058d0 00:22:09.331 [2024-07-15 09:26:18.082661] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.331 [2024-07-15 09:26:18.083008] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.331 [2024-07-15 09:26:18.083027] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:09.331 [2024-07-15 09:26:18.083095] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt3 00:22:09.331 [2024-07-15 09:26:18.083113] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:09.331 pt3 00:22:09.331 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:09.331 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:09.331 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:09.590 [2024-07-15 09:26:18.331243] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:09.590 [2024-07-15 09:26:18.331284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.590 [2024-07-15 09:26:18.331302] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1806b80 00:22:09.590 [2024-07-15 09:26:18.331314] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.590 [2024-07-15 09:26:18.331639] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.590 [2024-07-15 09:26:18.331657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:09.590 [2024-07-15 09:26:18.331719] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:09.590 [2024-07-15 09:26:18.331738] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:09.590 [2024-07-15 09:26:18.331864] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1803780 00:22:09.590 [2024-07-15 09:26:18.331875] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:09.590 [2024-07-15 09:26:18.332068] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1808fa0 00:22:09.590 [2024-07-15 09:26:18.332208] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1803780 00:22:09.590 [2024-07-15 09:26:18.332218] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1803780 00:22:09.590 [2024-07-15 09:26:18.332319] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.590 pt4 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.590 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.157 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.157 "name": "raid_bdev1", 00:22:10.157 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:10.157 "strip_size_kb": 0, 00:22:10.157 "state": "online", 00:22:10.157 "raid_level": "raid1", 00:22:10.157 "superblock": true, 00:22:10.157 "num_base_bdevs": 4, 00:22:10.157 "num_base_bdevs_discovered": 4, 00:22:10.157 "num_base_bdevs_operational": 4, 00:22:10.157 "base_bdevs_list": [ 00:22:10.157 { 00:22:10.157 "name": "pt1", 00:22:10.157 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:10.157 "is_configured": true, 00:22:10.157 "data_offset": 2048, 00:22:10.157 "data_size": 63488 00:22:10.157 }, 00:22:10.157 { 00:22:10.157 "name": "pt2", 00:22:10.157 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:10.157 "is_configured": true, 00:22:10.157 "data_offset": 2048, 00:22:10.157 "data_size": 63488 00:22:10.157 }, 00:22:10.157 { 00:22:10.157 "name": "pt3", 00:22:10.157 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:10.157 "is_configured": true, 00:22:10.157 "data_offset": 2048, 00:22:10.157 "data_size": 63488 00:22:10.157 }, 00:22:10.157 { 00:22:10.157 "name": "pt4", 00:22:10.157 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:10.157 "is_configured": true, 00:22:10.157 "data_offset": 2048, 00:22:10.157 "data_size": 63488 00:22:10.157 } 00:22:10.157 ] 00:22:10.157 }' 00:22:10.157 09:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.157 09:26:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:10.725 [2024-07-15 09:26:19.614965] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:10.725 "name": "raid_bdev1", 00:22:10.725 "aliases": [ 00:22:10.725 "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2" 00:22:10.725 ], 00:22:10.725 "product_name": "Raid Volume", 00:22:10.725 "block_size": 512, 00:22:10.725 "num_blocks": 63488, 00:22:10.725 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:10.725 "assigned_rate_limits": { 00:22:10.725 "rw_ios_per_sec": 0, 
00:22:10.725 "rw_mbytes_per_sec": 0, 00:22:10.725 "r_mbytes_per_sec": 0, 00:22:10.725 "w_mbytes_per_sec": 0 00:22:10.725 }, 00:22:10.725 "claimed": false, 00:22:10.725 "zoned": false, 00:22:10.725 "supported_io_types": { 00:22:10.725 "read": true, 00:22:10.725 "write": true, 00:22:10.725 "unmap": false, 00:22:10.725 "flush": false, 00:22:10.725 "reset": true, 00:22:10.725 "nvme_admin": false, 00:22:10.725 "nvme_io": false, 00:22:10.725 "nvme_io_md": false, 00:22:10.725 "write_zeroes": true, 00:22:10.725 "zcopy": false, 00:22:10.725 "get_zone_info": false, 00:22:10.725 "zone_management": false, 00:22:10.725 "zone_append": false, 00:22:10.725 "compare": false, 00:22:10.725 "compare_and_write": false, 00:22:10.725 "abort": false, 00:22:10.725 "seek_hole": false, 00:22:10.725 "seek_data": false, 00:22:10.725 "copy": false, 00:22:10.725 "nvme_iov_md": false 00:22:10.725 }, 00:22:10.725 "memory_domains": [ 00:22:10.725 { 00:22:10.725 "dma_device_id": "system", 00:22:10.725 "dma_device_type": 1 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.725 "dma_device_type": 2 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "dma_device_id": "system", 00:22:10.725 "dma_device_type": 1 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.725 "dma_device_type": 2 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "dma_device_id": "system", 00:22:10.725 "dma_device_type": 1 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.725 "dma_device_type": 2 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "dma_device_id": "system", 00:22:10.725 "dma_device_type": 1 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.725 "dma_device_type": 2 00:22:10.725 } 00:22:10.725 ], 00:22:10.725 "driver_specific": { 00:22:10.725 "raid": { 00:22:10.725 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:10.725 "strip_size_kb": 0, 00:22:10.725 "state": "online", 00:22:10.725 "raid_level": "raid1", 00:22:10.725 "superblock": true, 00:22:10.725 "num_base_bdevs": 4, 00:22:10.725 "num_base_bdevs_discovered": 4, 00:22:10.725 "num_base_bdevs_operational": 4, 00:22:10.725 "base_bdevs_list": [ 00:22:10.725 { 00:22:10.725 "name": "pt1", 00:22:10.725 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:10.725 "is_configured": true, 00:22:10.725 "data_offset": 2048, 00:22:10.725 "data_size": 63488 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "name": "pt2", 00:22:10.725 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:10.725 "is_configured": true, 00:22:10.725 "data_offset": 2048, 00:22:10.725 "data_size": 63488 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "name": "pt3", 00:22:10.725 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:10.725 "is_configured": true, 00:22:10.725 "data_offset": 2048, 00:22:10.725 "data_size": 63488 00:22:10.725 }, 00:22:10.725 { 00:22:10.725 "name": "pt4", 00:22:10.725 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:10.725 "is_configured": true, 00:22:10.725 "data_offset": 2048, 00:22:10.725 "data_size": 63488 00:22:10.725 } 00:22:10.725 ] 00:22:10.725 } 00:22:10.725 } 00:22:10.725 }' 00:22:10.725 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:10.984 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:10.984 pt2 00:22:10.984 pt3 00:22:10.984 pt4' 00:22:10.984 09:26:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:10.984 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:10.984 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:10.984 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:10.984 "name": "pt1", 00:22:10.984 "aliases": [ 00:22:10.984 "00000000-0000-0000-0000-000000000001" 00:22:10.984 ], 00:22:10.984 "product_name": "passthru", 00:22:10.984 "block_size": 512, 00:22:10.984 "num_blocks": 65536, 00:22:10.984 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:10.984 "assigned_rate_limits": { 00:22:10.984 "rw_ios_per_sec": 0, 00:22:10.984 "rw_mbytes_per_sec": 0, 00:22:10.984 "r_mbytes_per_sec": 0, 00:22:10.984 "w_mbytes_per_sec": 0 00:22:10.984 }, 00:22:10.984 "claimed": true, 00:22:10.984 "claim_type": "exclusive_write", 00:22:10.984 "zoned": false, 00:22:10.984 "supported_io_types": { 00:22:10.984 "read": true, 00:22:10.984 "write": true, 00:22:10.984 "unmap": true, 00:22:10.984 "flush": true, 00:22:10.984 "reset": true, 00:22:10.984 "nvme_admin": false, 00:22:10.984 "nvme_io": false, 00:22:10.984 "nvme_io_md": false, 00:22:10.984 "write_zeroes": true, 00:22:10.984 "zcopy": true, 00:22:10.984 "get_zone_info": false, 00:22:10.984 "zone_management": false, 00:22:10.984 "zone_append": false, 00:22:10.984 "compare": false, 00:22:10.984 "compare_and_write": false, 00:22:10.984 "abort": true, 00:22:10.984 "seek_hole": false, 00:22:10.984 "seek_data": false, 00:22:10.984 "copy": true, 00:22:10.984 "nvme_iov_md": false 00:22:10.984 }, 00:22:10.984 "memory_domains": [ 00:22:10.984 { 00:22:10.984 "dma_device_id": "system", 00:22:10.984 "dma_device_type": 1 00:22:10.984 }, 00:22:10.984 { 00:22:10.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.984 "dma_device_type": 2 00:22:10.984 } 00:22:10.984 ], 00:22:10.984 "driver_specific": { 00:22:10.984 "passthru": { 00:22:10.984 "name": "pt1", 00:22:10.984 "base_bdev_name": "malloc1" 00:22:10.984 } 00:22:10.984 } 00:22:10.984 }' 00:22:10.984 09:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.243 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.243 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:11.243 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.243 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.243 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:11.243 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.243 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.500 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:11.500 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.500 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.500 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:11.500 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:11.501 09:26:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:11.501 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:11.759 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:11.759 "name": "pt2", 00:22:11.759 "aliases": [ 00:22:11.759 "00000000-0000-0000-0000-000000000002" 00:22:11.759 ], 00:22:11.759 "product_name": "passthru", 00:22:11.759 "block_size": 512, 00:22:11.759 "num_blocks": 65536, 00:22:11.759 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:11.759 "assigned_rate_limits": { 00:22:11.759 "rw_ios_per_sec": 0, 00:22:11.759 "rw_mbytes_per_sec": 0, 00:22:11.759 "r_mbytes_per_sec": 0, 00:22:11.759 "w_mbytes_per_sec": 0 00:22:11.759 }, 00:22:11.759 "claimed": true, 00:22:11.759 "claim_type": "exclusive_write", 00:22:11.759 "zoned": false, 00:22:11.759 "supported_io_types": { 00:22:11.759 "read": true, 00:22:11.759 "write": true, 00:22:11.759 "unmap": true, 00:22:11.759 "flush": true, 00:22:11.759 "reset": true, 00:22:11.759 "nvme_admin": false, 00:22:11.759 "nvme_io": false, 00:22:11.759 "nvme_io_md": false, 00:22:11.759 "write_zeroes": true, 00:22:11.759 "zcopy": true, 00:22:11.759 "get_zone_info": false, 00:22:11.759 "zone_management": false, 00:22:11.759 "zone_append": false, 00:22:11.759 "compare": false, 00:22:11.759 "compare_and_write": false, 00:22:11.759 "abort": true, 00:22:11.759 "seek_hole": false, 00:22:11.759 "seek_data": false, 00:22:11.759 "copy": true, 00:22:11.759 "nvme_iov_md": false 00:22:11.759 }, 00:22:11.759 "memory_domains": [ 00:22:11.759 { 00:22:11.759 "dma_device_id": "system", 00:22:11.759 "dma_device_type": 1 00:22:11.759 }, 00:22:11.759 { 00:22:11.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.759 "dma_device_type": 2 00:22:11.759 } 00:22:11.759 ], 00:22:11.759 "driver_specific": { 00:22:11.759 "passthru": { 00:22:11.759 "name": "pt2", 00:22:11.759 "base_bdev_name": "malloc2" 00:22:11.759 } 00:22:11.759 } 00:22:11.759 }' 00:22:11.759 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.759 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.759 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:11.759 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.759 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.759 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:11.759 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.028 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.028 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:12.028 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.028 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.028 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:12.028 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:12.028 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:12.028 09:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:12.287 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:12.287 "name": "pt3", 00:22:12.287 "aliases": [ 00:22:12.287 "00000000-0000-0000-0000-000000000003" 00:22:12.287 ], 00:22:12.287 "product_name": "passthru", 00:22:12.287 "block_size": 512, 00:22:12.287 "num_blocks": 65536, 00:22:12.287 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:12.287 "assigned_rate_limits": { 00:22:12.287 "rw_ios_per_sec": 0, 00:22:12.287 "rw_mbytes_per_sec": 0, 00:22:12.287 "r_mbytes_per_sec": 0, 00:22:12.287 "w_mbytes_per_sec": 0 00:22:12.287 }, 00:22:12.287 "claimed": true, 00:22:12.287 "claim_type": "exclusive_write", 00:22:12.287 "zoned": false, 00:22:12.287 "supported_io_types": { 00:22:12.287 "read": true, 00:22:12.287 "write": true, 00:22:12.287 "unmap": true, 00:22:12.287 "flush": true, 00:22:12.287 "reset": true, 00:22:12.287 "nvme_admin": false, 00:22:12.287 "nvme_io": false, 00:22:12.287 "nvme_io_md": false, 00:22:12.287 "write_zeroes": true, 00:22:12.287 "zcopy": true, 00:22:12.287 "get_zone_info": false, 00:22:12.287 "zone_management": false, 00:22:12.287 "zone_append": false, 00:22:12.287 "compare": false, 00:22:12.287 "compare_and_write": false, 00:22:12.287 "abort": true, 00:22:12.287 "seek_hole": false, 00:22:12.287 "seek_data": false, 00:22:12.287 "copy": true, 00:22:12.287 "nvme_iov_md": false 00:22:12.287 }, 00:22:12.287 "memory_domains": [ 00:22:12.287 { 00:22:12.287 "dma_device_id": "system", 00:22:12.287 "dma_device_type": 1 00:22:12.287 }, 00:22:12.287 { 00:22:12.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.287 "dma_device_type": 2 00:22:12.287 } 00:22:12.287 ], 00:22:12.287 "driver_specific": { 00:22:12.287 "passthru": { 00:22:12.287 "name": "pt3", 00:22:12.287 "base_bdev_name": "malloc3" 00:22:12.287 } 00:22:12.287 } 00:22:12.287 }' 00:22:12.287 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.287 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.287 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:12.287 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.544 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.544 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:12.544 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.544 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.544 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:12.544 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.544 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.802 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:12.802 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:12.802 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:12.802 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:22:13.061 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:13.061 "name": "pt4", 00:22:13.061 "aliases": [ 00:22:13.061 "00000000-0000-0000-0000-000000000004" 00:22:13.061 ], 00:22:13.061 "product_name": "passthru", 00:22:13.061 "block_size": 512, 00:22:13.061 "num_blocks": 65536, 00:22:13.061 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:13.061 "assigned_rate_limits": { 00:22:13.061 "rw_ios_per_sec": 0, 00:22:13.061 "rw_mbytes_per_sec": 0, 00:22:13.061 "r_mbytes_per_sec": 0, 00:22:13.061 "w_mbytes_per_sec": 0 00:22:13.061 }, 00:22:13.061 "claimed": true, 00:22:13.061 "claim_type": "exclusive_write", 00:22:13.061 "zoned": false, 00:22:13.061 "supported_io_types": { 00:22:13.061 "read": true, 00:22:13.061 "write": true, 00:22:13.061 "unmap": true, 00:22:13.061 "flush": true, 00:22:13.061 "reset": true, 00:22:13.061 "nvme_admin": false, 00:22:13.061 "nvme_io": false, 00:22:13.061 "nvme_io_md": false, 00:22:13.061 "write_zeroes": true, 00:22:13.061 "zcopy": true, 00:22:13.061 "get_zone_info": false, 00:22:13.061 "zone_management": false, 00:22:13.061 "zone_append": false, 00:22:13.061 "compare": false, 00:22:13.061 "compare_and_write": false, 00:22:13.061 "abort": true, 00:22:13.061 "seek_hole": false, 00:22:13.061 "seek_data": false, 00:22:13.061 "copy": true, 00:22:13.061 "nvme_iov_md": false 00:22:13.061 }, 00:22:13.061 "memory_domains": [ 00:22:13.061 { 00:22:13.061 "dma_device_id": "system", 00:22:13.061 "dma_device_type": 1 00:22:13.061 }, 00:22:13.061 { 00:22:13.061 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:13.061 "dma_device_type": 2 00:22:13.061 } 00:22:13.061 ], 00:22:13.061 "driver_specific": { 00:22:13.061 "passthru": { 00:22:13.061 "name": "pt4", 00:22:13.061 "base_bdev_name": "malloc4" 00:22:13.061 } 00:22:13.061 } 00:22:13.061 }' 00:22:13.061 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:13.061 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:13.061 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:13.062 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:13.062 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:13.062 09:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:13.062 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.320 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.320 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:13.320 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.320 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.320 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:13.320 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:13.320 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:13.578 [2024-07-15 09:26:22.406344] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:13.578 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 
9b872258-5a11-4d2b-a8bd-8d0cb3b535d2 '!=' 9b872258-5a11-4d2b-a8bd-8d0cb3b535d2 ']' 00:22:13.578 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:13.578 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:13.578 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:13.578 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:13.837 [2024-07-15 09:26:22.642720] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.837 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.096 09:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.096 "name": "raid_bdev1", 00:22:14.096 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:14.096 "strip_size_kb": 0, 00:22:14.096 "state": "online", 00:22:14.096 "raid_level": "raid1", 00:22:14.096 "superblock": true, 00:22:14.096 "num_base_bdevs": 4, 00:22:14.096 "num_base_bdevs_discovered": 3, 00:22:14.096 "num_base_bdevs_operational": 3, 00:22:14.096 "base_bdevs_list": [ 00:22:14.096 { 00:22:14.096 "name": null, 00:22:14.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:14.096 "is_configured": false, 00:22:14.096 "data_offset": 2048, 00:22:14.096 "data_size": 63488 00:22:14.096 }, 00:22:14.096 { 00:22:14.096 "name": "pt2", 00:22:14.096 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:14.096 "is_configured": true, 00:22:14.096 "data_offset": 2048, 00:22:14.096 "data_size": 63488 00:22:14.096 }, 00:22:14.096 { 00:22:14.096 "name": "pt3", 00:22:14.096 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:14.096 "is_configured": true, 00:22:14.096 "data_offset": 2048, 00:22:14.096 "data_size": 63488 00:22:14.096 }, 00:22:14.096 { 00:22:14.096 "name": "pt4", 00:22:14.096 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:14.096 "is_configured": true, 00:22:14.096 "data_offset": 2048, 00:22:14.096 "data_size": 63488 00:22:14.096 } 00:22:14.096 ] 00:22:14.096 }' 00:22:14.096 09:26:22 
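verify_raid_bdev_state (invoked here at @495 as "raid_bdev1 online raid1 0 3") drives every state check in this test: it pulls the array's JSON via bdev_raid_get_bdevs and compares it against the expected state, RAID level, strip size and configured member count. A hedged reconstruction of those checks from the locals and the jq filter shown above (the exact field comparisons inside bdev_raid.sh are assumed, not quoted):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    raid_bdev_info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(jq -r .state <<<"$raid_bdev_info") == online ]]        # expected_state
    [[ $(jq -r .raid_level <<<"$raid_bdev_info") == raid1 ]]    # raid_level
    [[ $(jq -r .strip_size_kb <<<"$raid_bdev_info") -eq 0 ]]    # strip_size
    # after pt1 is removed, 3 of the 4 slots should still be configured
    [[ $(jq '[.base_bdevs_list[] | select(.is_configured)] | length' <<<"$raid_bdev_info") -eq 3 ]]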
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.096 09:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:14.662 09:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:14.921 [2024-07-15 09:26:23.697473] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:14.921 [2024-07-15 09:26:23.697504] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:14.921 [2024-07-15 09:26:23.697563] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:14.921 [2024-07-15 09:26:23.697628] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:14.921 [2024-07-15 09:26:23.697639] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1803780 name raid_bdev1, state offline 00:22:14.921 09:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.921 09:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:15.487 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:15.487 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:15.487 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:15.487 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:15.487 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:16.053 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:16.053 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:16.053 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:16.053 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:16.053 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:16.053 09:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:16.620 09:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:16.620 09:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:16.620 09:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:16.620 09:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:16.620 09:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:17.223 [2024-07-15 09:26:25.979393] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:17.223 [2024-07-15 09:26:25.979452] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:17.223 [2024-07-15 09:26:25.979474] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a6700 00:22:17.223 [2024-07-15 09:26:25.979487] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:17.223 [2024-07-15 09:26:25.981219] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:17.223 [2024-07-15 09:26:25.981251] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:17.223 [2024-07-15 09:26:25.981327] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:17.223 [2024-07-15 09:26:25.981363] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:17.223 pt2 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.223 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.482 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.482 "name": "raid_bdev1", 00:22:17.482 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:17.482 "strip_size_kb": 0, 00:22:17.482 "state": "configuring", 00:22:17.482 "raid_level": "raid1", 00:22:17.482 "superblock": true, 00:22:17.482 "num_base_bdevs": 4, 00:22:17.482 "num_base_bdevs_discovered": 1, 00:22:17.482 "num_base_bdevs_operational": 3, 00:22:17.482 "base_bdevs_list": [ 00:22:17.482 { 00:22:17.482 "name": null, 00:22:17.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.482 "is_configured": false, 00:22:17.482 "data_offset": 2048, 00:22:17.482 "data_size": 63488 00:22:17.482 }, 00:22:17.482 { 00:22:17.482 "name": "pt2", 00:22:17.482 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:17.482 "is_configured": true, 00:22:17.482 "data_offset": 2048, 00:22:17.482 "data_size": 63488 00:22:17.482 }, 00:22:17.482 { 00:22:17.482 "name": null, 00:22:17.482 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:17.482 "is_configured": false, 00:22:17.482 "data_offset": 2048, 00:22:17.482 "data_size": 63488 00:22:17.482 }, 00:22:17.482 { 00:22:17.482 "name": null, 00:22:17.482 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:17.482 "is_configured": 
false, 00:22:17.482 "data_offset": 2048, 00:22:17.482 "data_size": 63488 00:22:17.482 } 00:22:17.482 ] 00:22:17.482 }' 00:22:17.482 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.482 09:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:18.157 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:18.157 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:18.157 09:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:18.157 [2024-07-15 09:26:27.090342] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:18.157 [2024-07-15 09:26:27.090403] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:18.157 [2024-07-15 09:26:27.090427] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x180ca10 00:22:18.157 [2024-07-15 09:26:27.090439] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:18.157 [2024-07-15 09:26:27.090792] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:18.157 [2024-07-15 09:26:27.090810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:18.157 [2024-07-15 09:26:27.090880] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:18.157 [2024-07-15 09:26:27.090901] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:18.157 pt3 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.416 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.416 "name": "raid_bdev1", 00:22:18.416 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:18.416 "strip_size_kb": 0, 00:22:18.416 "state": "configuring", 00:22:18.417 "raid_level": "raid1", 00:22:18.417 "superblock": true, 00:22:18.417 "num_base_bdevs": 
4, 00:22:18.417 "num_base_bdevs_discovered": 2, 00:22:18.417 "num_base_bdevs_operational": 3, 00:22:18.417 "base_bdevs_list": [ 00:22:18.417 { 00:22:18.417 "name": null, 00:22:18.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.417 "is_configured": false, 00:22:18.417 "data_offset": 2048, 00:22:18.417 "data_size": 63488 00:22:18.417 }, 00:22:18.417 { 00:22:18.417 "name": "pt2", 00:22:18.417 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:18.417 "is_configured": true, 00:22:18.417 "data_offset": 2048, 00:22:18.417 "data_size": 63488 00:22:18.417 }, 00:22:18.417 { 00:22:18.417 "name": "pt3", 00:22:18.417 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:18.417 "is_configured": true, 00:22:18.417 "data_offset": 2048, 00:22:18.417 "data_size": 63488 00:22:18.417 }, 00:22:18.417 { 00:22:18.417 "name": null, 00:22:18.417 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:18.417 "is_configured": false, 00:22:18.417 "data_offset": 2048, 00:22:18.417 "data_size": 63488 00:22:18.417 } 00:22:18.417 ] 00:22:18.417 }' 00:22:18.417 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.417 09:26:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:18.985 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:18.985 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:18.985 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:22:18.985 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:19.243 [2024-07-15 09:26:27.968685] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:19.243 [2024-07-15 09:26:27.968741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.243 [2024-07-15 09:26:27.968761] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19af520 00:22:19.243 [2024-07-15 09:26:27.968774] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.243 [2024-07-15 09:26:27.969140] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.243 [2024-07-15 09:26:27.969160] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:19.243 [2024-07-15 09:26:27.969224] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:19.243 [2024-07-15 09:26:27.969244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:19.243 [2024-07-15 09:26:27.969357] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1803ea0 00:22:19.243 [2024-07-15 09:26:27.969368] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:19.243 [2024-07-15 09:26:27.969537] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1808600 00:22:19.243 [2024-07-15 09:26:27.969666] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1803ea0 00:22:19.243 [2024-07-15 09:26:27.969682] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1803ea0 00:22:19.243 [2024-07-15 09:26:27.969779] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:19.243 pt4 00:22:19.243 09:26:27 
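What @498-@522 exercise above is superblock-driven reassembly: the raid bdev and the surviving passthru members are deleted, and as each ptN is re-created the examine path (raid_bdev_examine_cont / raid_bdev_configure_base_bdev in the log) claims it again, taking raid_bdev1 from configuring with one member back to online once three of the four are present. A hedged sketch of that sequence, reusing the names and UUIDs from the trace:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $rpc bdev_raid_delete raid_bdev1                              # @498: tear the array down
    for i in 2 3 4; do $rpc bdev_passthru_delete pt$i; done       # @506: drop the remaining members
    for i in 2 3 4; do
        # @511/@519: re-create a member; its on-disk superblock is examined and re-claimed
        $rpc bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
        # state moves configuring -> configuring -> online as members 1, 2 and 3 come back
        $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'
    done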
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:19.243 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:19.243 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:19.243 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.243 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.243 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:19.243 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.243 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.243 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.243 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.244 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.244 09:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.244 09:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.244 "name": "raid_bdev1", 00:22:19.244 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:19.244 "strip_size_kb": 0, 00:22:19.244 "state": "online", 00:22:19.244 "raid_level": "raid1", 00:22:19.244 "superblock": true, 00:22:19.244 "num_base_bdevs": 4, 00:22:19.244 "num_base_bdevs_discovered": 3, 00:22:19.244 "num_base_bdevs_operational": 3, 00:22:19.244 "base_bdevs_list": [ 00:22:19.244 { 00:22:19.244 "name": null, 00:22:19.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.244 "is_configured": false, 00:22:19.244 "data_offset": 2048, 00:22:19.244 "data_size": 63488 00:22:19.244 }, 00:22:19.244 { 00:22:19.244 "name": "pt2", 00:22:19.244 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:19.244 "is_configured": true, 00:22:19.244 "data_offset": 2048, 00:22:19.244 "data_size": 63488 00:22:19.244 }, 00:22:19.244 { 00:22:19.244 "name": "pt3", 00:22:19.244 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:19.244 "is_configured": true, 00:22:19.244 "data_offset": 2048, 00:22:19.244 "data_size": 63488 00:22:19.244 }, 00:22:19.244 { 00:22:19.244 "name": "pt4", 00:22:19.244 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:19.244 "is_configured": true, 00:22:19.244 "data_offset": 2048, 00:22:19.244 "data_size": 63488 00:22:19.244 } 00:22:19.244 ] 00:22:19.244 }' 00:22:19.244 09:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.244 09:26:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:19.810 09:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:20.068 [2024-07-15 09:26:28.919197] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:20.068 [2024-07-15 09:26:28.919226] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:20.068 [2024-07-15 09:26:28.919276] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:22:20.068 [2024-07-15 09:26:28.919342] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:20.068 [2024-07-15 09:26:28.919354] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1803ea0 name raid_bdev1, state offline 00:22:20.068 09:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.068 09:26:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:20.327 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:20.327 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:20.327 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:22:20.327 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:22:20.327 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:20.585 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:20.843 [2024-07-15 09:26:29.657116] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:20.843 [2024-07-15 09:26:29.657162] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:20.843 [2024-07-15 09:26:29.657179] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19af520 00:22:20.843 [2024-07-15 09:26:29.657192] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:20.843 [2024-07-15 09:26:29.658796] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:20.843 [2024-07-15 09:26:29.658825] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:20.843 [2024-07-15 09:26:29.658889] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:20.843 [2024-07-15 09:26:29.658916] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:20.843 [2024-07-15 09:26:29.659032] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:20.843 [2024-07-15 09:26:29.659047] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:20.843 [2024-07-15 09:26:29.659061] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1803060 name raid_bdev1, state configuring 00:22:20.843 [2024-07-15 09:26:29.659085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:20.843 [2024-07-15 09:26:29.659159] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:20.843 pt1 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.843 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.101 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.101 "name": "raid_bdev1", 00:22:21.101 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:21.101 "strip_size_kb": 0, 00:22:21.101 "state": "configuring", 00:22:21.101 "raid_level": "raid1", 00:22:21.101 "superblock": true, 00:22:21.101 "num_base_bdevs": 4, 00:22:21.101 "num_base_bdevs_discovered": 2, 00:22:21.101 "num_base_bdevs_operational": 3, 00:22:21.101 "base_bdevs_list": [ 00:22:21.101 { 00:22:21.101 "name": null, 00:22:21.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.101 "is_configured": false, 00:22:21.101 "data_offset": 2048, 00:22:21.101 "data_size": 63488 00:22:21.101 }, 00:22:21.101 { 00:22:21.101 "name": "pt2", 00:22:21.101 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:21.101 "is_configured": true, 00:22:21.101 "data_offset": 2048, 00:22:21.101 "data_size": 63488 00:22:21.101 }, 00:22:21.101 { 00:22:21.101 "name": "pt3", 00:22:21.101 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:21.101 "is_configured": true, 00:22:21.101 "data_offset": 2048, 00:22:21.101 "data_size": 63488 00:22:21.101 }, 00:22:21.101 { 00:22:21.101 "name": null, 00:22:21.101 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:21.101 "is_configured": false, 00:22:21.101 "data_offset": 2048, 00:22:21.101 "data_size": 63488 00:22:21.101 } 00:22:21.101 ] 00:22:21.101 }' 00:22:21.101 09:26:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.101 09:26:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:21.666 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:22:21.666 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:21.924 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:22:21.924 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:22.181 [2024-07-15 09:26:30.948540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:22.181 [2024-07-15 09:26:30.948593] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.181 [2024-07-15 09:26:30.948613] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1803310 00:22:22.181 [2024-07-15 09:26:30.948625] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.181 [2024-07-15 09:26:30.948988] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.181 [2024-07-15 09:26:30.949007] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:22.181 [2024-07-15 09:26:30.949071] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:22.181 [2024-07-15 09:26:30.949090] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:22.181 [2024-07-15 09:26:30.949203] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1806b40 00:22:22.181 [2024-07-15 09:26:30.949213] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:22.181 [2024-07-15 09:26:30.949392] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a6990 00:22:22.181 [2024-07-15 09:26:30.949524] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1806b40 00:22:22.181 [2024-07-15 09:26:30.949534] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1806b40 00:22:22.181 [2024-07-15 09:26:30.949631] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.181 pt4 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.181 09:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.439 09:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.439 "name": "raid_bdev1", 00:22:22.439 "uuid": "9b872258-5a11-4d2b-a8bd-8d0cb3b535d2", 00:22:22.439 "strip_size_kb": 0, 00:22:22.439 "state": "online", 00:22:22.439 "raid_level": "raid1", 00:22:22.439 "superblock": true, 00:22:22.439 "num_base_bdevs": 4, 00:22:22.439 "num_base_bdevs_discovered": 3, 00:22:22.439 "num_base_bdevs_operational": 3, 00:22:22.439 "base_bdevs_list": [ 00:22:22.439 { 00:22:22.439 "name": null, 00:22:22.439 
"uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.439 "is_configured": false, 00:22:22.439 "data_offset": 2048, 00:22:22.439 "data_size": 63488 00:22:22.439 }, 00:22:22.439 { 00:22:22.439 "name": "pt2", 00:22:22.439 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:22.439 "is_configured": true, 00:22:22.439 "data_offset": 2048, 00:22:22.439 "data_size": 63488 00:22:22.439 }, 00:22:22.439 { 00:22:22.439 "name": "pt3", 00:22:22.439 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:22.439 "is_configured": true, 00:22:22.439 "data_offset": 2048, 00:22:22.439 "data_size": 63488 00:22:22.439 }, 00:22:22.439 { 00:22:22.439 "name": "pt4", 00:22:22.439 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:22.439 "is_configured": true, 00:22:22.439 "data_offset": 2048, 00:22:22.439 "data_size": 63488 00:22:22.439 } 00:22:22.439 ] 00:22:22.439 }' 00:22:22.439 09:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.439 09:26:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:23.003 09:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:23.003 09:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:23.260 09:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:23.260 09:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:23.260 09:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:23.518 [2024-07-15 09:26:32.220220] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 9b872258-5a11-4d2b-a8bd-8d0cb3b535d2 '!=' 9b872258-5a11-4d2b-a8bd-8d0cb3b535d2 ']' 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 184586 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 184586 ']' 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 184586 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 184586 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 184586' 00:22:23.518 killing process with pid 184586 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 184586 00:22:23.518 [2024-07-15 09:26:32.288965] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:23.518 [2024-07-15 09:26:32.289021] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:23.518 [2024-07-15 09:26:32.289088] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:23.518 [2024-07-15 09:26:32.289100] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1806b40 name raid_bdev1, state offline 00:22:23.518 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 184586 00:22:23.518 [2024-07-15 09:26:32.330999] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:23.775 09:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:23.775 00:22:23.775 real 0m27.176s 00:22:23.775 user 0m49.865s 00:22:23.775 sys 0m4.652s 00:22:23.775 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:23.775 09:26:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:23.775 ************************************ 00:22:23.775 END TEST raid_superblock_test 00:22:23.775 ************************************ 00:22:23.775 09:26:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:23.775 09:26:32 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:22:23.775 09:26:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:23.775 09:26:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:23.775 09:26:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:23.775 ************************************ 00:22:23.775 START TEST raid_read_error_test 00:22:23.775 ************************************ 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 
'BaseBdev3' 'BaseBdev4') 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.0pZRjVwH9J 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=188622 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 188622 /var/tmp/spdk-raid.sock 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 188622 ']' 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:23.775 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:23.775 09:26:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:23.776 [2024-07-15 09:26:32.716215] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:22:23.776 [2024-07-15 09:26:32.716279] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid188622 ] 00:22:24.033 [2024-07-15 09:26:32.846770] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:24.033 [2024-07-15 09:26:32.952311] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:24.290 [2024-07-15 09:26:33.007394] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:24.290 [2024-07-15 09:26:33.007426] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:24.853 09:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:24.853 09:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:24.853 09:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:24.853 09:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:25.110 BaseBdev1_malloc 00:22:25.110 09:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:25.368 true 00:22:25.368 09:26:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:25.626 [2024-07-15 09:26:34.365990] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:25.626 [2024-07-15 09:26:34.366034] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:25.626 [2024-07-15 09:26:34.366055] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26e80d0 00:22:25.626 [2024-07-15 09:26:34.366067] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:25.626 [2024-07-15 09:26:34.367957] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:25.626 [2024-07-15 09:26:34.367986] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:25.626 BaseBdev1 00:22:25.626 09:26:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:25.626 09:26:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:25.884 BaseBdev2_malloc 00:22:25.884 09:26:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:26.142 true 00:22:26.142 09:26:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:26.399 [2024-07-15 09:26:35.105747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:26.399 [2024-07-15 09:26:35.105790] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:26.399 [2024-07-15 09:26:35.105811] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ec910 00:22:26.399 [2024-07-15 09:26:35.105823] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:26.399 [2024-07-15 09:26:35.107377] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:26.399 [2024-07-15 09:26:35.107405] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:26.399 BaseBdev2 00:22:26.399 09:26:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:26.399 09:26:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:26.656 BaseBdev3_malloc 00:22:26.656 09:26:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:26.656 true 00:22:26.914 09:26:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:26.914 [2024-07-15 09:26:35.844286] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:26.914 [2024-07-15 09:26:35.844329] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:26.914 [2024-07-15 09:26:35.844349] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26eebd0 00:22:26.914 [2024-07-15 09:26:35.844362] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:26.914 [2024-07-15 09:26:35.845955] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:26.914 [2024-07-15 09:26:35.845983] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:26.914 BaseBdev3 00:22:26.914 09:26:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:26.914 09:26:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:27.171 BaseBdev4_malloc 00:22:27.171 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:27.429 true 00:22:27.429 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:27.686 [2024-07-15 09:26:36.574822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:27.686 [2024-07-15 09:26:36.574866] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:27.686 [2024-07-15 09:26:36.574888] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26efaa0 00:22:27.686 [2024-07-15 09:26:36.574900] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:27.686 [2024-07-15 09:26:36.576478] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:22:27.686 [2024-07-15 09:26:36.576505] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:27.686 BaseBdev4 00:22:27.687 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:27.944 [2024-07-15 09:26:36.807472] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:27.944 [2024-07-15 09:26:36.808844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:27.944 [2024-07-15 09:26:36.808913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:27.944 [2024-07-15 09:26:36.808983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:27.944 [2024-07-15 09:26:36.809224] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26e9c20 00:22:27.944 [2024-07-15 09:26:36.809236] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:27.944 [2024-07-15 09:26:36.809430] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x253e260 00:22:27.944 [2024-07-15 09:26:36.809591] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26e9c20 00:22:27.944 [2024-07-15 09:26:36.809601] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26e9c20 00:22:27.944 [2024-07-15 09:26:36.809716] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.944 09:26:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.202 09:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:28.202 "name": "raid_bdev1", 00:22:28.202 "uuid": "c07954c5-2827-42c8-8c0c-3c83a607dd01", 00:22:28.202 "strip_size_kb": 0, 00:22:28.202 "state": "online", 00:22:28.202 "raid_level": "raid1", 00:22:28.202 "superblock": true, 00:22:28.202 "num_base_bdevs": 4, 00:22:28.202 "num_base_bdevs_discovered": 4, 00:22:28.202 
"num_base_bdevs_operational": 4, 00:22:28.202 "base_bdevs_list": [ 00:22:28.202 { 00:22:28.202 "name": "BaseBdev1", 00:22:28.202 "uuid": "e0c49821-0b5c-58d1-b1cb-7ac116966423", 00:22:28.202 "is_configured": true, 00:22:28.202 "data_offset": 2048, 00:22:28.202 "data_size": 63488 00:22:28.202 }, 00:22:28.202 { 00:22:28.202 "name": "BaseBdev2", 00:22:28.202 "uuid": "2dc25c84-478d-5654-9b6b-4ce5f616c71a", 00:22:28.202 "is_configured": true, 00:22:28.202 "data_offset": 2048, 00:22:28.202 "data_size": 63488 00:22:28.202 }, 00:22:28.202 { 00:22:28.202 "name": "BaseBdev3", 00:22:28.202 "uuid": "c48fee4c-4b82-5085-9534-dd2f77359a00", 00:22:28.202 "is_configured": true, 00:22:28.202 "data_offset": 2048, 00:22:28.202 "data_size": 63488 00:22:28.202 }, 00:22:28.202 { 00:22:28.202 "name": "BaseBdev4", 00:22:28.202 "uuid": "80641fe6-a3b0-5e13-b5bf-4c698773924e", 00:22:28.202 "is_configured": true, 00:22:28.202 "data_offset": 2048, 00:22:28.202 "data_size": 63488 00:22:28.202 } 00:22:28.202 ] 00:22:28.202 }' 00:22:28.202 09:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:28.202 09:26:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:28.765 09:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:28.765 09:26:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:29.023 [2024-07-15 09:26:37.766280] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x253dc60 00:22:29.956 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.213 09:26:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:30.471 09:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:30.471 "name": "raid_bdev1", 00:22:30.471 "uuid": "c07954c5-2827-42c8-8c0c-3c83a607dd01", 00:22:30.471 "strip_size_kb": 0, 00:22:30.471 "state": "online", 00:22:30.471 "raid_level": "raid1", 00:22:30.471 "superblock": true, 00:22:30.471 "num_base_bdevs": 4, 00:22:30.471 "num_base_bdevs_discovered": 4, 00:22:30.471 "num_base_bdevs_operational": 4, 00:22:30.471 "base_bdevs_list": [ 00:22:30.471 { 00:22:30.471 "name": "BaseBdev1", 00:22:30.471 "uuid": "e0c49821-0b5c-58d1-b1cb-7ac116966423", 00:22:30.471 "is_configured": true, 00:22:30.471 "data_offset": 2048, 00:22:30.471 "data_size": 63488 00:22:30.471 }, 00:22:30.471 { 00:22:30.471 "name": "BaseBdev2", 00:22:30.471 "uuid": "2dc25c84-478d-5654-9b6b-4ce5f616c71a", 00:22:30.471 "is_configured": true, 00:22:30.471 "data_offset": 2048, 00:22:30.471 "data_size": 63488 00:22:30.471 }, 00:22:30.471 { 00:22:30.471 "name": "BaseBdev3", 00:22:30.471 "uuid": "c48fee4c-4b82-5085-9534-dd2f77359a00", 00:22:30.471 "is_configured": true, 00:22:30.471 "data_offset": 2048, 00:22:30.471 "data_size": 63488 00:22:30.471 }, 00:22:30.471 { 00:22:30.471 "name": "BaseBdev4", 00:22:30.471 "uuid": "80641fe6-a3b0-5e13-b5bf-4c698773924e", 00:22:30.471 "is_configured": true, 00:22:30.471 "data_offset": 2048, 00:22:30.471 "data_size": 63488 00:22:30.471 } 00:22:30.471 ] 00:22:30.471 }' 00:22:30.471 09:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:30.471 09:26:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:31.036 09:26:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:31.295 [2024-07-15 09:26:40.004198] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:31.295 [2024-07-15 09:26:40.004236] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:31.295 [2024-07-15 09:26:40.007750] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:31.295 [2024-07-15 09:26:40.007795] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:31.295 [2024-07-15 09:26:40.007917] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:31.295 [2024-07-15 09:26:40.007941] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26e9c20 name raid_bdev1, state offline 00:22:31.295 0 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 188622 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 188622 ']' 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 188622 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 188622 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 188622' 00:22:31.295 killing process with pid 188622 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 188622 00:22:31.295 [2024-07-15 09:26:40.074588] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:31.295 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 188622 00:22:31.295 [2024-07-15 09:26:40.105356] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.0pZRjVwH9J 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:31.554 00:22:31.554 real 0m7.702s 00:22:31.554 user 0m12.373s 00:22:31.554 sys 0m1.326s 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:31.554 09:26:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:31.554 ************************************ 00:22:31.554 END TEST raid_read_error_test 00:22:31.554 ************************************ 00:22:31.554 09:26:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:31.554 09:26:40 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:22:31.554 09:26:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:31.554 09:26:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:31.554 09:26:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:31.554 ************************************ 00:22:31.554 START TEST raid_write_error_test 00:22:31.554 ************************************ 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
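Both error tests here build each raid1 member as a three-layer stack before assembling the array; the read test that just finished did this above, and the write test now starting repeats the same sequence. A minimal sketch of that RPC layering, taken from the commands visible in the trace (the rpc.py path and the /var/tmp/spdk-raid.sock socket are the values this particular run used, not fixed defaults):

    # one malloc-backed bdev per member (32 MB, 512 B blocks), wrapped in an error bdev,
    # then a passthru bdev; finally assemble the four passthru bdevs into raid1 with a superblock (-s)
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    for i in 1 2 3 4; do
        $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        $rpc -s $sock bdev_error_create BaseBdev${i}_malloc                  # exposes EE_BaseBdev${i}_malloc
        $rpc -s $sock bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done
    $rpc -s $sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s

The error bdev in the middle of each stack is what later lets bdev_error_inject_error force read or write failures on a single member without touching the others.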
00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.m1b1Idb9Vi 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=189770 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 189770 /var/tmp/spdk-raid.sock 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 189770 ']' 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:31.554 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
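Both error tests drive I/O through bdevperf started as shown here: the -z flag keeps it idle and waiting for RPCs on /var/tmp/spdk-raid.sock, so the test can build the array first and only then start traffic and inject a fault. A rough outline of that driver flow, assuming the same binary paths as this run and a mktemp-generated log file whose exact name differs per run:

    bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
    bdevperf_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    log=$(mktemp -p /raidtest)

    $bdevperf -r $sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid > $log &
    # ... create the base bdevs and raid_bdev1 as sketched above, then:
    $bdevperf_py -s $sock perform_tests &
    $rpc -s $sock bdev_error_inject_error EE_BaseBdev1_malloc write failure    # read test injects: read failure
    # after the run, the per-bdev failure rate is read back out of the bdevperf log:
    fail_per_s=$(grep -v Job $log | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s = 0.00 ]]    # raid1 is redundant, so injected faults must not surface as failed I/O

As the traces show, the injected read fault leaves the array with all four members discovered, while the injected write fault fails the member in slot 0 and the array drops to three operational members, which is exactly the difference between the two verify_raid_bdev_state checks.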
00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:31.554 09:26:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:31.812 [2024-07-15 09:26:40.508274] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:22:31.812 [2024-07-15 09:26:40.508347] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid189770 ] 00:22:31.812 [2024-07-15 09:26:40.641994] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:31.812 [2024-07-15 09:26:40.743287] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:32.069 [2024-07-15 09:26:40.806879] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:32.069 [2024-07-15 09:26:40.806916] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:32.634 09:26:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:32.634 09:26:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:32.634 09:26:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:32.634 09:26:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:32.892 BaseBdev1_malloc 00:22:32.892 09:26:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:33.150 true 00:22:33.150 09:26:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:33.407 [2024-07-15 09:26:42.153801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:33.407 [2024-07-15 09:26:42.153844] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:33.407 [2024-07-15 09:26:42.153863] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e6e0d0 00:22:33.407 [2024-07-15 09:26:42.153875] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:33.407 [2024-07-15 09:26:42.155585] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:33.407 [2024-07-15 09:26:42.155613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:33.407 BaseBdev1 00:22:33.407 09:26:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:33.407 09:26:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:33.664 BaseBdev2_malloc 00:22:33.665 09:26:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:33.921 true 00:22:33.921 09:26:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:34.177 [2024-07-15 09:26:42.892294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:34.177 [2024-07-15 09:26:42.892335] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:34.177 [2024-07-15 09:26:42.892356] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e72910 00:22:34.177 [2024-07-15 09:26:42.892368] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:34.177 [2024-07-15 09:26:42.893787] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:34.177 [2024-07-15 09:26:42.893814] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:34.177 BaseBdev2 00:22:34.177 09:26:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:34.177 09:26:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:34.433 BaseBdev3_malloc 00:22:34.433 09:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:34.690 true 00:22:34.690 09:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:34.690 [2024-07-15 09:26:43.622774] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:34.690 [2024-07-15 09:26:43.622816] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:34.690 [2024-07-15 09:26:43.622835] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e74bd0 00:22:34.690 [2024-07-15 09:26:43.622847] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:34.690 [2024-07-15 09:26:43.624233] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:34.690 [2024-07-15 09:26:43.624258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:34.690 BaseBdev3 00:22:34.946 09:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:34.946 09:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:34.946 BaseBdev4_malloc 00:22:34.946 09:26:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:35.202 true 00:22:35.202 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:35.458 [2024-07-15 09:26:44.349157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:35.458 [2024-07-15 09:26:44.349200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:22:35.458 [2024-07-15 09:26:44.349219] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e75aa0 00:22:35.458 [2024-07-15 09:26:44.349231] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:35.458 [2024-07-15 09:26:44.350620] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:35.458 [2024-07-15 09:26:44.350647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:35.458 BaseBdev4 00:22:35.458 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:35.715 [2024-07-15 09:26:44.593834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:35.715 [2024-07-15 09:26:44.595062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:35.715 [2024-07-15 09:26:44.595128] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:35.715 [2024-07-15 09:26:44.595189] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:35.715 [2024-07-15 09:26:44.595413] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e6fc20 00:22:35.715 [2024-07-15 09:26:44.595424] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:35.715 [2024-07-15 09:26:44.595606] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc4260 00:22:35.715 [2024-07-15 09:26:44.595756] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e6fc20 00:22:35.715 [2024-07-15 09:26:44.595767] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e6fc20 00:22:35.715 [2024-07-15 09:26:44.595866] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.715 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.971 09:26:44 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.971 "name": "raid_bdev1", 00:22:35.971 "uuid": "1c0f2e39-5319-46af-878a-1af76443ceeb", 00:22:35.971 "strip_size_kb": 0, 00:22:35.971 "state": "online", 00:22:35.971 "raid_level": "raid1", 00:22:35.971 "superblock": true, 00:22:35.971 "num_base_bdevs": 4, 00:22:35.971 "num_base_bdevs_discovered": 4, 00:22:35.971 "num_base_bdevs_operational": 4, 00:22:35.971 "base_bdevs_list": [ 00:22:35.971 { 00:22:35.971 "name": "BaseBdev1", 00:22:35.971 "uuid": "4e6e7d26-abf7-55a5-89ed-3e6f8676e950", 00:22:35.971 "is_configured": true, 00:22:35.971 "data_offset": 2048, 00:22:35.971 "data_size": 63488 00:22:35.971 }, 00:22:35.971 { 00:22:35.971 "name": "BaseBdev2", 00:22:35.971 "uuid": "e22e1aa0-7585-501a-ace7-8ed4e04a1229", 00:22:35.971 "is_configured": true, 00:22:35.971 "data_offset": 2048, 00:22:35.971 "data_size": 63488 00:22:35.971 }, 00:22:35.971 { 00:22:35.971 "name": "BaseBdev3", 00:22:35.971 "uuid": "e4122feb-5864-56c1-97c1-0b35874f6643", 00:22:35.971 "is_configured": true, 00:22:35.971 "data_offset": 2048, 00:22:35.971 "data_size": 63488 00:22:35.971 }, 00:22:35.971 { 00:22:35.971 "name": "BaseBdev4", 00:22:35.971 "uuid": "849a92f9-7327-5c0d-8a54-89d57a84d416", 00:22:35.971 "is_configured": true, 00:22:35.971 "data_offset": 2048, 00:22:35.971 "data_size": 63488 00:22:35.971 } 00:22:35.971 ] 00:22:35.971 }' 00:22:35.971 09:26:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.971 09:26:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:36.532 09:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:36.532 09:26:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:36.789 [2024-07-15 09:26:45.560939] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc3c60 00:22:37.721 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:37.978 [2024-07-15 09:26:46.685157] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:22:37.978 [2024-07-15 09:26:46.685216] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:37.978 [2024-07-15 09:26:46.685432] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1cc3c60 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.978 
09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.978 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:38.236 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:38.236 "name": "raid_bdev1", 00:22:38.236 "uuid": "1c0f2e39-5319-46af-878a-1af76443ceeb", 00:22:38.236 "strip_size_kb": 0, 00:22:38.236 "state": "online", 00:22:38.236 "raid_level": "raid1", 00:22:38.236 "superblock": true, 00:22:38.236 "num_base_bdevs": 4, 00:22:38.236 "num_base_bdevs_discovered": 3, 00:22:38.236 "num_base_bdevs_operational": 3, 00:22:38.236 "base_bdevs_list": [ 00:22:38.236 { 00:22:38.236 "name": null, 00:22:38.236 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.236 "is_configured": false, 00:22:38.236 "data_offset": 2048, 00:22:38.236 "data_size": 63488 00:22:38.236 }, 00:22:38.236 { 00:22:38.236 "name": "BaseBdev2", 00:22:38.236 "uuid": "e22e1aa0-7585-501a-ace7-8ed4e04a1229", 00:22:38.236 "is_configured": true, 00:22:38.236 "data_offset": 2048, 00:22:38.236 "data_size": 63488 00:22:38.236 }, 00:22:38.236 { 00:22:38.236 "name": "BaseBdev3", 00:22:38.236 "uuid": "e4122feb-5864-56c1-97c1-0b35874f6643", 00:22:38.236 "is_configured": true, 00:22:38.236 "data_offset": 2048, 00:22:38.236 "data_size": 63488 00:22:38.236 }, 00:22:38.236 { 00:22:38.236 "name": "BaseBdev4", 00:22:38.236 "uuid": "849a92f9-7327-5c0d-8a54-89d57a84d416", 00:22:38.236 "is_configured": true, 00:22:38.236 "data_offset": 2048, 00:22:38.236 "data_size": 63488 00:22:38.236 } 00:22:38.236 ] 00:22:38.236 }' 00:22:38.236 09:26:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:38.236 09:26:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:38.801 09:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:38.801 [2024-07-15 09:26:47.719986] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:38.801 [2024-07-15 09:26:47.720026] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:38.801 [2024-07-15 09:26:47.723179] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:38.801 [2024-07-15 09:26:47.723214] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:38.801 [2024-07-15 09:26:47.723310] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:38.801 [2024-07-15 09:26:47.723322] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e6fc20 name raid_bdev1, state 
offline 00:22:38.801 0 00:22:38.801 09:26:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 189770 00:22:38.801 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 189770 ']' 00:22:38.801 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 189770 00:22:38.801 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:22:38.801 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:38.801 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 189770 00:22:39.059 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:39.059 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:39.059 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 189770' 00:22:39.059 killing process with pid 189770 00:22:39.059 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 189770 00:22:39.059 [2024-07-15 09:26:47.780485] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:39.059 09:26:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 189770 00:22:39.059 [2024-07-15 09:26:47.811782] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.m1b1Idb9Vi 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:39.317 00:22:39.317 real 0m7.615s 00:22:39.317 user 0m12.180s 00:22:39.317 sys 0m1.355s 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:39.317 09:26:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:39.317 ************************************ 00:22:39.317 END TEST raid_write_error_test 00:22:39.317 ************************************ 00:22:39.317 09:26:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:39.317 09:26:48 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:22:39.317 09:26:48 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:22:39.317 09:26:48 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:22:39.317 09:26:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:39.317 09:26:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:39.317 09:26:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:39.317 ************************************ 00:22:39.317 START TEST raid_rebuild_test 00:22:39.317 ************************************ 00:22:39.317 09:26:48 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=190760 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 190760 /var/tmp/spdk-raid.sock 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 190760 ']' 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:39.317 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
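The rebuild test starting here uses a two-member raid1 plus a dedicated spare, and the spare is built on top of a delay bdev, presumably so the rebuild runs slowly enough to be observed while in progress. A condensed sketch of the construction and of the remove/add cycle that triggers the rebuild, using the RPCs that appear in the trace below (the -w/-n delay arguments are, to the best of our reading, microseconds, i.e. roughly 100 ms of added write latency; the hex pointers in the log are specific to this run):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # two ordinary members: malloc + passthru, no error bdev this time
    for i in 1 2; do
        $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        $rpc -s $sock bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
    done
    # the spare: malloc -> delay -> passthru
    $rpc -s $sock bdev_malloc_create 32 512 -b spare_malloc
    $rpc -s $sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $rpc -s $sock bdev_passthru_create -b spare_delay -p spare
    # assemble without a superblock, fill the volume with data over NBD (see the dd further down),
    # then swap a member out and the spare in to start the rebuild:
    $rpc -s $sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
    $rpc -s $sock bdev_raid_remove_base_bdev BaseBdev1
    $rpc -s $sock bdev_raid_add_base_bdev raid_bdev1 spare      # rebuild onto the spare begins here

verify_raid_bdev_process then reads bdev_raid_get_bdevs and checks that the reported process block has type "rebuild" and target "spare", which is the JSON with "blocks": 24576 / "percent": 37 later in the trace.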
00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:39.317 09:26:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:39.317 [2024-07-15 09:26:48.194625] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:22:39.317 [2024-07-15 09:26:48.194676] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190760 ] 00:22:39.317 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:39.317 Zero copy mechanism will not be used. 00:22:39.575 [2024-07-15 09:26:48.309727] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:39.575 [2024-07-15 09:26:48.407497] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:39.575 [2024-07-15 09:26:48.464756] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:39.575 [2024-07-15 09:26:48.464794] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:40.140 09:26:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:40.140 09:26:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:22:40.140 09:26:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:40.140 09:26:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:40.398 BaseBdev1_malloc 00:22:40.398 09:26:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:40.656 [2024-07-15 09:26:49.543865] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:40.656 [2024-07-15 09:26:49.543914] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.656 [2024-07-15 09:26:49.543941] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbe7d40 00:22:40.656 [2024-07-15 09:26:49.543954] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.656 [2024-07-15 09:26:49.545518] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.656 [2024-07-15 09:26:49.545546] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:40.656 BaseBdev1 00:22:40.656 09:26:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:40.656 09:26:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:40.915 BaseBdev2_malloc 00:22:40.915 09:26:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:41.173 [2024-07-15 09:26:50.026012] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:41.173 [2024-07-15 09:26:50.026064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:41.173 
[2024-07-15 09:26:50.026092] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbe8860 00:22:41.173 [2024-07-15 09:26:50.026108] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:41.173 [2024-07-15 09:26:50.027678] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:41.173 [2024-07-15 09:26:50.027707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:41.173 BaseBdev2 00:22:41.173 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:41.432 spare_malloc 00:22:41.432 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:41.690 spare_delay 00:22:41.690 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:41.948 [2024-07-15 09:26:50.740645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:41.948 [2024-07-15 09:26:50.740693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:41.948 [2024-07-15 09:26:50.740715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd96ec0 00:22:41.948 [2024-07-15 09:26:50.740728] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:41.948 [2024-07-15 09:26:50.742323] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:41.948 [2024-07-15 09:26:50.742352] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:41.948 spare 00:22:41.948 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:42.207 [2024-07-15 09:26:50.905117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:42.207 [2024-07-15 09:26:50.906481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:42.207 [2024-07-15 09:26:50.906560] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd98070 00:22:42.207 [2024-07-15 09:26:50.906571] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:42.207 [2024-07-15 09:26:50.906795] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd91490 00:22:42.207 [2024-07-15 09:26:50.906956] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd98070 00:22:42.207 [2024-07-15 09:26:50.906967] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd98070 00:22:42.207 [2024-07-15 09:26:50.907090] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.207 09:26:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.207 09:26:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.207 "name": "raid_bdev1", 00:22:42.207 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:42.207 "strip_size_kb": 0, 00:22:42.207 "state": "online", 00:22:42.207 "raid_level": "raid1", 00:22:42.207 "superblock": false, 00:22:42.207 "num_base_bdevs": 2, 00:22:42.207 "num_base_bdevs_discovered": 2, 00:22:42.207 "num_base_bdevs_operational": 2, 00:22:42.207 "base_bdevs_list": [ 00:22:42.207 { 00:22:42.207 "name": "BaseBdev1", 00:22:42.207 "uuid": "05143a1e-52eb-52b1-805b-8e1c173d56eb", 00:22:42.207 "is_configured": true, 00:22:42.207 "data_offset": 0, 00:22:42.207 "data_size": 65536 00:22:42.207 }, 00:22:42.207 { 00:22:42.207 "name": "BaseBdev2", 00:22:42.207 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:42.207 "is_configured": true, 00:22:42.207 "data_offset": 0, 00:22:42.207 "data_size": 65536 00:22:42.207 } 00:22:42.207 ] 00:22:42.207 }' 00:22:42.207 09:26:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.207 09:26:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:42.773 09:26:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:42.773 09:26:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:43.030 [2024-07-15 09:26:51.928026] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:43.030 09:26:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:43.030 09:26:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.030 09:26:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:43.288 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:43.546 [2024-07-15 09:26:52.421137] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd91490 00:22:43.546 /dev/nbd0 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:43.546 1+0 records in 00:22:43.546 1+0 records out 00:22:43.546 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000222255 s, 18.4 MB/s 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 
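Before the rebuild is triggered, the raid volume is exported as an NBD device and filled with random data; the dd that follows writes the full 65536-block (32 MiB) volume through /dev/nbd0. A short sketch of the export, fill, and teardown steps around it, using the nbd RPCs visible in the trace (/dev/nbd0 is simply the node this test host had free):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    $rpc -s $sock nbd_start_disk raid_bdev1 /dev/nbd0                  # expose the raid volume as a block device
    dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct    # fill all 32 MiB with random data
    $rpc -s $sock nbd_stop_disk /dev/nbd0                              # detach before reconfiguring the array

Writing real data first gives the rebuild something to copy onto the spare, and the rebuild progress JSON further down is reported against this same 65536-block volume (24576 of 65536 blocks is the 37 percent shown there).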
00:22:43.546 09:26:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:22:48.924 65536+0 records in 00:22:48.925 65536+0 records out 00:22:48.925 33554432 bytes (34 MB, 32 MiB) copied, 5.16425 s, 6.5 MB/s 00:22:48.925 09:26:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:48.925 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:48.925 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:48.925 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:48.925 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:48.925 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:48.925 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:49.183 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:49.183 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:49.183 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:49.183 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:49.183 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:49.183 [2024-07-15 09:26:57.925115] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.183 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:49.183 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:49.183 09:26:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:49.183 09:26:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:49.442 [2024-07-15 09:26:58.149768] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.442 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.700 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.700 "name": "raid_bdev1", 00:22:49.700 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:49.700 "strip_size_kb": 0, 00:22:49.700 "state": "online", 00:22:49.700 "raid_level": "raid1", 00:22:49.700 "superblock": false, 00:22:49.700 "num_base_bdevs": 2, 00:22:49.700 "num_base_bdevs_discovered": 1, 00:22:49.700 "num_base_bdevs_operational": 1, 00:22:49.700 "base_bdevs_list": [ 00:22:49.700 { 00:22:49.700 "name": null, 00:22:49.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.700 "is_configured": false, 00:22:49.700 "data_offset": 0, 00:22:49.700 "data_size": 65536 00:22:49.700 }, 00:22:49.700 { 00:22:49.700 "name": "BaseBdev2", 00:22:49.700 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:49.700 "is_configured": true, 00:22:49.700 "data_offset": 0, 00:22:49.700 "data_size": 65536 00:22:49.700 } 00:22:49.700 ] 00:22:49.700 }' 00:22:49.700 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.700 09:26:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:50.268 09:26:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:50.268 [2024-07-15 09:26:59.220598] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:50.527 [2024-07-15 09:26:59.225875] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd98880 00:22:50.527 [2024-07-15 09:26:59.228157] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:50.527 09:26:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:51.463 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:51.463 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:51.463 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:51.463 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:51.463 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:51.463 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.463 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.722 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:51.722 "name": "raid_bdev1", 00:22:51.723 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:51.723 "strip_size_kb": 0, 00:22:51.723 "state": "online", 00:22:51.723 "raid_level": "raid1", 00:22:51.723 "superblock": false, 00:22:51.723 "num_base_bdevs": 2, 00:22:51.723 "num_base_bdevs_discovered": 2, 00:22:51.723 "num_base_bdevs_operational": 2, 00:22:51.723 "process": { 00:22:51.723 "type": "rebuild", 00:22:51.723 "target": "spare", 00:22:51.723 "progress": { 00:22:51.723 "blocks": 24576, 00:22:51.723 "percent": 37 00:22:51.723 } 00:22:51.723 }, 00:22:51.723 
"base_bdevs_list": [ 00:22:51.723 { 00:22:51.723 "name": "spare", 00:22:51.723 "uuid": "05a73384-7c1f-54be-9842-c3abf9f49064", 00:22:51.723 "is_configured": true, 00:22:51.723 "data_offset": 0, 00:22:51.723 "data_size": 65536 00:22:51.723 }, 00:22:51.723 { 00:22:51.723 "name": "BaseBdev2", 00:22:51.723 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:51.723 "is_configured": true, 00:22:51.723 "data_offset": 0, 00:22:51.723 "data_size": 65536 00:22:51.723 } 00:22:51.723 ] 00:22:51.723 }' 00:22:51.723 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:51.723 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:51.723 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:51.723 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:51.723 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:51.982 [2024-07-15 09:27:00.818129] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:51.982 [2024-07-15 09:27:00.840622] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:51.982 [2024-07-15 09:27:00.840671] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:51.982 [2024-07-15 09:27:00.840686] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:51.982 [2024-07-15 09:27:00.840695] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.982 09:27:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.240 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.240 "name": "raid_bdev1", 00:22:52.240 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:52.240 "strip_size_kb": 0, 00:22:52.240 "state": "online", 00:22:52.240 "raid_level": "raid1", 00:22:52.240 "superblock": false, 00:22:52.240 "num_base_bdevs": 2, 00:22:52.240 
"num_base_bdevs_discovered": 1, 00:22:52.240 "num_base_bdevs_operational": 1, 00:22:52.240 "base_bdevs_list": [ 00:22:52.240 { 00:22:52.240 "name": null, 00:22:52.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.240 "is_configured": false, 00:22:52.240 "data_offset": 0, 00:22:52.240 "data_size": 65536 00:22:52.240 }, 00:22:52.240 { 00:22:52.240 "name": "BaseBdev2", 00:22:52.240 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:52.240 "is_configured": true, 00:22:52.240 "data_offset": 0, 00:22:52.240 "data_size": 65536 00:22:52.240 } 00:22:52.240 ] 00:22:52.240 }' 00:22:52.240 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.240 09:27:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:52.805 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:52.805 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:52.805 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:52.805 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:52.805 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:52.805 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.805 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.063 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:53.063 "name": "raid_bdev1", 00:22:53.063 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:53.063 "strip_size_kb": 0, 00:22:53.063 "state": "online", 00:22:53.063 "raid_level": "raid1", 00:22:53.063 "superblock": false, 00:22:53.063 "num_base_bdevs": 2, 00:22:53.063 "num_base_bdevs_discovered": 1, 00:22:53.063 "num_base_bdevs_operational": 1, 00:22:53.063 "base_bdevs_list": [ 00:22:53.063 { 00:22:53.063 "name": null, 00:22:53.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:53.063 "is_configured": false, 00:22:53.063 "data_offset": 0, 00:22:53.063 "data_size": 65536 00:22:53.063 }, 00:22:53.063 { 00:22:53.063 "name": "BaseBdev2", 00:22:53.063 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:53.063 "is_configured": true, 00:22:53.063 "data_offset": 0, 00:22:53.063 "data_size": 65536 00:22:53.063 } 00:22:53.063 ] 00:22:53.063 }' 00:22:53.063 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:53.063 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:53.063 09:27:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:53.320 09:27:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:53.320 09:27:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:53.320 [2024-07-15 09:27:02.249614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:53.320 [2024-07-15 09:27:02.254553] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd91490 00:22:53.320 [2024-07-15 09:27:02.256078] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:53.320 09:27:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:54.692 "name": "raid_bdev1", 00:22:54.692 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:54.692 "strip_size_kb": 0, 00:22:54.692 "state": "online", 00:22:54.692 "raid_level": "raid1", 00:22:54.692 "superblock": false, 00:22:54.692 "num_base_bdevs": 2, 00:22:54.692 "num_base_bdevs_discovered": 2, 00:22:54.692 "num_base_bdevs_operational": 2, 00:22:54.692 "process": { 00:22:54.692 "type": "rebuild", 00:22:54.692 "target": "spare", 00:22:54.692 "progress": { 00:22:54.692 "blocks": 24576, 00:22:54.692 "percent": 37 00:22:54.692 } 00:22:54.692 }, 00:22:54.692 "base_bdevs_list": [ 00:22:54.692 { 00:22:54.692 "name": "spare", 00:22:54.692 "uuid": "05a73384-7c1f-54be-9842-c3abf9f49064", 00:22:54.692 "is_configured": true, 00:22:54.692 "data_offset": 0, 00:22:54.692 "data_size": 65536 00:22:54.692 }, 00:22:54.692 { 00:22:54.692 "name": "BaseBdev2", 00:22:54.692 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:54.692 "is_configured": true, 00:22:54.692 "data_offset": 0, 00:22:54.692 "data_size": 65536 00:22:54.692 } 00:22:54.692 ] 00:22:54.692 }' 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=767 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:54.692 09:27:03 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.692 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.950 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:54.950 "name": "raid_bdev1", 00:22:54.950 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:54.950 "strip_size_kb": 0, 00:22:54.950 "state": "online", 00:22:54.950 "raid_level": "raid1", 00:22:54.950 "superblock": false, 00:22:54.950 "num_base_bdevs": 2, 00:22:54.950 "num_base_bdevs_discovered": 2, 00:22:54.950 "num_base_bdevs_operational": 2, 00:22:54.950 "process": { 00:22:54.950 "type": "rebuild", 00:22:54.950 "target": "spare", 00:22:54.950 "progress": { 00:22:54.950 "blocks": 30720, 00:22:54.950 "percent": 46 00:22:54.950 } 00:22:54.950 }, 00:22:54.950 "base_bdevs_list": [ 00:22:54.950 { 00:22:54.950 "name": "spare", 00:22:54.950 "uuid": "05a73384-7c1f-54be-9842-c3abf9f49064", 00:22:54.950 "is_configured": true, 00:22:54.950 "data_offset": 0, 00:22:54.950 "data_size": 65536 00:22:54.950 }, 00:22:54.950 { 00:22:54.950 "name": "BaseBdev2", 00:22:54.950 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:54.950 "is_configured": true, 00:22:54.950 "data_offset": 0, 00:22:54.950 "data_size": 65536 00:22:54.950 } 00:22:54.950 ] 00:22:54.950 }' 00:22:54.950 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:55.208 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:55.208 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:55.208 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:55.208 09:27:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:56.139 09:27:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:56.139 09:27:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:56.139 09:27:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:56.139 09:27:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:56.139 09:27:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:56.139 09:27:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:56.139 09:27:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.139 09:27:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.396 09:27:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:56.396 "name": "raid_bdev1", 00:22:56.396 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:56.396 "strip_size_kb": 0, 00:22:56.396 "state": "online", 00:22:56.396 
"raid_level": "raid1", 00:22:56.396 "superblock": false, 00:22:56.396 "num_base_bdevs": 2, 00:22:56.396 "num_base_bdevs_discovered": 2, 00:22:56.396 "num_base_bdevs_operational": 2, 00:22:56.396 "process": { 00:22:56.397 "type": "rebuild", 00:22:56.397 "target": "spare", 00:22:56.397 "progress": { 00:22:56.397 "blocks": 59392, 00:22:56.397 "percent": 90 00:22:56.397 } 00:22:56.397 }, 00:22:56.397 "base_bdevs_list": [ 00:22:56.397 { 00:22:56.397 "name": "spare", 00:22:56.397 "uuid": "05a73384-7c1f-54be-9842-c3abf9f49064", 00:22:56.397 "is_configured": true, 00:22:56.397 "data_offset": 0, 00:22:56.397 "data_size": 65536 00:22:56.397 }, 00:22:56.397 { 00:22:56.397 "name": "BaseBdev2", 00:22:56.397 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:56.397 "is_configured": true, 00:22:56.397 "data_offset": 0, 00:22:56.397 "data_size": 65536 00:22:56.397 } 00:22:56.397 ] 00:22:56.397 }' 00:22:56.397 09:27:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:56.397 09:27:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:56.397 09:27:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:56.397 09:27:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:56.397 09:27:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:56.653 [2024-07-15 09:27:05.481111] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:56.653 [2024-07-15 09:27:05.481173] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:56.653 [2024-07-15 09:27:05.481210] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:57.586 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:57.586 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:57.586 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:57.586 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:57.586 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:57.586 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:57.586 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.586 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:57.843 "name": "raid_bdev1", 00:22:57.843 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:57.843 "strip_size_kb": 0, 00:22:57.843 "state": "online", 00:22:57.843 "raid_level": "raid1", 00:22:57.843 "superblock": false, 00:22:57.843 "num_base_bdevs": 2, 00:22:57.843 "num_base_bdevs_discovered": 2, 00:22:57.843 "num_base_bdevs_operational": 2, 00:22:57.843 "base_bdevs_list": [ 00:22:57.843 { 00:22:57.843 "name": "spare", 00:22:57.843 "uuid": "05a73384-7c1f-54be-9842-c3abf9f49064", 00:22:57.843 "is_configured": true, 00:22:57.843 "data_offset": 0, 00:22:57.843 "data_size": 65536 00:22:57.843 }, 00:22:57.843 { 00:22:57.843 "name": "BaseBdev2", 00:22:57.843 
"uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:57.843 "is_configured": true, 00:22:57.843 "data_offset": 0, 00:22:57.843 "data_size": 65536 00:22:57.843 } 00:22:57.843 ] 00:22:57.843 }' 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.843 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:58.100 "name": "raid_bdev1", 00:22:58.100 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:58.100 "strip_size_kb": 0, 00:22:58.100 "state": "online", 00:22:58.100 "raid_level": "raid1", 00:22:58.100 "superblock": false, 00:22:58.100 "num_base_bdevs": 2, 00:22:58.100 "num_base_bdevs_discovered": 2, 00:22:58.100 "num_base_bdevs_operational": 2, 00:22:58.100 "base_bdevs_list": [ 00:22:58.100 { 00:22:58.100 "name": "spare", 00:22:58.100 "uuid": "05a73384-7c1f-54be-9842-c3abf9f49064", 00:22:58.100 "is_configured": true, 00:22:58.100 "data_offset": 0, 00:22:58.100 "data_size": 65536 00:22:58.100 }, 00:22:58.100 { 00:22:58.100 "name": "BaseBdev2", 00:22:58.100 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:58.100 "is_configured": true, 00:22:58.100 "data_offset": 0, 00:22:58.100 "data_size": 65536 00:22:58.100 } 00:22:58.100 ] 00:22:58.100 }' 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:58.100 09:27:06 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.100 09:27:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.363 09:27:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.363 "name": "raid_bdev1", 00:22:58.363 "uuid": "019cef63-6118-4457-99c7-68ed321a2b48", 00:22:58.363 "strip_size_kb": 0, 00:22:58.363 "state": "online", 00:22:58.363 "raid_level": "raid1", 00:22:58.363 "superblock": false, 00:22:58.363 "num_base_bdevs": 2, 00:22:58.363 "num_base_bdevs_discovered": 2, 00:22:58.363 "num_base_bdevs_operational": 2, 00:22:58.363 "base_bdevs_list": [ 00:22:58.363 { 00:22:58.363 "name": "spare", 00:22:58.363 "uuid": "05a73384-7c1f-54be-9842-c3abf9f49064", 00:22:58.363 "is_configured": true, 00:22:58.363 "data_offset": 0, 00:22:58.363 "data_size": 65536 00:22:58.363 }, 00:22:58.363 { 00:22:58.363 "name": "BaseBdev2", 00:22:58.363 "uuid": "93b01e6b-1dab-5da1-8606-8d84fd7fc6b8", 00:22:58.363 "is_configured": true, 00:22:58.363 "data_offset": 0, 00:22:58.363 "data_size": 65536 00:22:58.363 } 00:22:58.363 ] 00:22:58.363 }' 00:22:58.363 09:27:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.363 09:27:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:58.929 09:27:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:59.187 [2024-07-15 09:27:08.057178] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:59.187 [2024-07-15 09:27:08.057209] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:59.187 [2024-07-15 09:27:08.057275] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:59.187 [2024-07-15 09:27:08.057333] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:59.187 [2024-07-15 09:27:08.057345] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd98070 name raid_bdev1, state offline 00:22:59.187 09:27:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.187 09:27:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' 
'/dev/nbd0 /dev/nbd1' 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:59.446 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:59.705 /dev/nbd0 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:59.705 1+0 records in 00:22:59.705 1+0 records out 00:22:59.705 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024865 s, 16.5 MB/s 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:59.705 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:59.964 /dev/nbd1 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
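The byte-for-byte check that this part of the trace performs reads more easily as a condensed shell sketch. Every RPC and comparison below already appears in the surrounding log (rpc.py against /var/tmp/spdk-raid.sock, nbd_start_disk, cmp -i 0, nbd_stop_disk); the $rpc/$sock shorthand and the explicit failure message are readability additions, not part of the original test script.

    # Condensed sketch of the verification step traced above and below.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    $rpc -s $sock nbd_start_disk BaseBdev1 /dev/nbd0   # member removed earlier in the test
    $rpc -s $sock nbd_start_disk spare /dev/nbd1       # replacement that was just rebuilt

    # raid1 mirrors the data, so the rebuilt spare must match the removed member
    # byte for byte; cmp exits non-zero at the first difference.
    cmp -i 0 /dev/nbd0 /dev/nbd1 || echo "rebuild verification failed"

    $rpc -s $sock nbd_stop_disk /dev/nbd0
    $rpc -s $sock nbd_stop_disk /dev/nbd1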
00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:59.964 1+0 records in 00:22:59.964 1+0 records out 00:22:59.964 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000375936 s, 10.9 MB/s 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:59.964 09:27:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:00.222 09:27:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:00.222 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:00.222 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:00.222 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:00.222 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:00.222 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:00.222 09:27:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:00.481 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:00.481 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:00.481 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:00.481 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:00.481 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:00.481 09:27:09 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:00.481 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:00.481 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:00.481 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:00.481 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 190760 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 190760 ']' 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 190760 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 190760 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 190760' 00:23:00.779 killing process with pid 190760 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 190760 00:23:00.779 Received shutdown signal, test time was about 60.000000 seconds 00:23:00.779 00:23:00.779 Latency(us) 00:23:00.779 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:00.779 =================================================================================================================== 00:23:00.779 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:00.779 [2024-07-15 09:27:09.589512] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:00.779 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 190760 00:23:00.779 [2024-07-15 09:27:09.616989] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:23:01.039 00:23:01.039 real 0m21.713s 00:23:01.039 user 0m29.203s 00:23:01.039 sys 0m4.886s 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:01.039 ************************************ 00:23:01.039 END TEST raid_rebuild_test 00:23:01.039 ************************************ 00:23:01.039 09:27:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:01.039 09:27:09 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:23:01.039 09:27:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:01.039 09:27:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:01.039 09:27:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:01.039 ************************************ 00:23:01.039 START TEST raid_rebuild_test_sb 00:23:01.039 ************************************ 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=194311 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 194311 /var/tmp/spdk-raid.sock 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 194311 ']' 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:01.039 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:01.039 09:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:01.039 [2024-07-15 09:27:09.965640] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:23:01.039 [2024-07-15 09:27:09.965685] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194311 ] 00:23:01.039 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:01.039 Zero copy mechanism will not be used. 
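Before the superblock variant exercises the rebuild path, it assembles its two-member raid1 array from a small bdev stack. The sketch below only collects, in one place, the RPC calls that appear in the trace that follows; the $rpc/$sock shorthand and the comments are added for readability and are not part of the original test script.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Two members: 32 MiB, 512-byte-block malloc bdevs wrapped in passthru bdevs.
    $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $rpc -s $sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
    $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
    $rpc -s $sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2

    # The spare sits behind a delay bdev, which slows writes to it so the rebuild
    # runs long enough for the test to sample its progress.
    $rpc -s $sock bdev_malloc_create 32 512 -b spare_malloc
    $rpc -s $sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $rpc -s $sock bdev_passthru_create -b spare_delay -p spare

    # raid1 with an on-disk superblock (-s); this is why bdev_raid_get_bdevs later
    # reports data_offset 2048 and data_size 63488 instead of 0 and 65536.
    $rpc -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1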
00:23:01.298 [2024-07-15 09:27:10.077783] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:01.298 [2024-07-15 09:27:10.181890] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:01.298 [2024-07-15 09:27:10.250842] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:01.298 [2024-07-15 09:27:10.250879] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:02.234 09:27:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:02.234 09:27:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:23:02.234 09:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:02.234 09:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:02.234 BaseBdev1_malloc 00:23:02.234 09:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:02.493 [2024-07-15 09:27:11.388481] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:02.493 [2024-07-15 09:27:11.388532] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:02.493 [2024-07-15 09:27:11.388556] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa96d40 00:23:02.493 [2024-07-15 09:27:11.388569] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:02.493 [2024-07-15 09:27:11.390477] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:02.493 [2024-07-15 09:27:11.390507] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:02.493 BaseBdev1 00:23:02.493 09:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:02.493 09:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:02.752 BaseBdev2_malloc 00:23:02.752 09:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:03.011 [2024-07-15 09:27:11.887974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:03.011 [2024-07-15 09:27:11.888022] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.011 [2024-07-15 09:27:11.888045] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa97860 00:23:03.011 [2024-07-15 09:27:11.888058] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.011 [2024-07-15 09:27:11.889632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.011 [2024-07-15 09:27:11.889659] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:03.011 BaseBdev2 00:23:03.011 09:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:23:03.269 spare_malloc 00:23:03.269 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:03.528 spare_delay 00:23:03.528 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:03.787 [2024-07-15 09:27:12.626537] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:03.787 [2024-07-15 09:27:12.626584] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.787 [2024-07-15 09:27:12.626604] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc45ec0 00:23:03.787 [2024-07-15 09:27:12.626617] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.787 [2024-07-15 09:27:12.628197] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.787 [2024-07-15 09:27:12.628226] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:03.787 spare 00:23:03.787 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:04.046 [2024-07-15 09:27:12.875222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:04.046 [2024-07-15 09:27:12.876578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:04.046 [2024-07-15 09:27:12.876749] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc47070 00:23:04.046 [2024-07-15 09:27:12.876762] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:04.046 [2024-07-15 09:27:12.876979] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc40490 00:23:04.046 [2024-07-15 09:27:12.877131] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc47070 00:23:04.046 [2024-07-15 09:27:12.877142] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc47070 00:23:04.046 [2024-07-15 09:27:12.877246] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.046 09:27:12 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.046 09:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.305 09:27:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.305 "name": "raid_bdev1", 00:23:04.305 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:04.305 "strip_size_kb": 0, 00:23:04.305 "state": "online", 00:23:04.305 "raid_level": "raid1", 00:23:04.305 "superblock": true, 00:23:04.305 "num_base_bdevs": 2, 00:23:04.305 "num_base_bdevs_discovered": 2, 00:23:04.305 "num_base_bdevs_operational": 2, 00:23:04.305 "base_bdevs_list": [ 00:23:04.305 { 00:23:04.305 "name": "BaseBdev1", 00:23:04.305 "uuid": "fbdd6a77-a26c-562f-82eb-a740f68941e6", 00:23:04.305 "is_configured": true, 00:23:04.305 "data_offset": 2048, 00:23:04.305 "data_size": 63488 00:23:04.305 }, 00:23:04.305 { 00:23:04.305 "name": "BaseBdev2", 00:23:04.305 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:04.305 "is_configured": true, 00:23:04.306 "data_offset": 2048, 00:23:04.306 "data_size": 63488 00:23:04.306 } 00:23:04.306 ] 00:23:04.306 }' 00:23:04.306 09:27:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.306 09:27:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:04.873 09:27:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:04.873 09:27:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:05.131 [2024-07-15 09:27:13.966314] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:05.131 09:27:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:05.131 09:27:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.131 09:27:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:05.391 09:27:14 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:05.391 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:05.650 [2024-07-15 09:27:14.463437] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc40490 00:23:05.650 /dev/nbd0 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:05.650 1+0 records in 00:23:05.650 1+0 records out 00:23:05.650 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000416273 s, 9.8 MB/s 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:05.650 09:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:23:12.213 63488+0 records in 00:23:12.213 63488+0 records out 00:23:12.213 32505856 bytes (33 MB, 31 MiB) copied, 6.02704 s, 5.4 MB/s 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:12.213 [2024-07-15 09:27:20.823129] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:12.213 09:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:12.214 [2024-07-15 09:27:21.059795] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.214 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.471 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:12.471 "name": "raid_bdev1", 00:23:12.471 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:12.471 "strip_size_kb": 0, 00:23:12.471 "state": "online", 
00:23:12.471 "raid_level": "raid1", 00:23:12.471 "superblock": true, 00:23:12.471 "num_base_bdevs": 2, 00:23:12.471 "num_base_bdevs_discovered": 1, 00:23:12.471 "num_base_bdevs_operational": 1, 00:23:12.471 "base_bdevs_list": [ 00:23:12.471 { 00:23:12.471 "name": null, 00:23:12.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.471 "is_configured": false, 00:23:12.471 "data_offset": 2048, 00:23:12.471 "data_size": 63488 00:23:12.471 }, 00:23:12.471 { 00:23:12.471 "name": "BaseBdev2", 00:23:12.471 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:12.471 "is_configured": true, 00:23:12.472 "data_offset": 2048, 00:23:12.472 "data_size": 63488 00:23:12.472 } 00:23:12.472 ] 00:23:12.472 }' 00:23:12.472 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:12.472 09:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:13.037 09:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:13.295 [2024-07-15 09:27:22.158729] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:13.295 [2024-07-15 09:27:22.163654] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc47b30 00:23:13.295 [2024-07-15 09:27:22.165867] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:13.295 09:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.668 "name": "raid_bdev1", 00:23:14.668 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:14.668 "strip_size_kb": 0, 00:23:14.668 "state": "online", 00:23:14.668 "raid_level": "raid1", 00:23:14.668 "superblock": true, 00:23:14.668 "num_base_bdevs": 2, 00:23:14.668 "num_base_bdevs_discovered": 2, 00:23:14.668 "num_base_bdevs_operational": 2, 00:23:14.668 "process": { 00:23:14.668 "type": "rebuild", 00:23:14.668 "target": "spare", 00:23:14.668 "progress": { 00:23:14.668 "blocks": 24576, 00:23:14.668 "percent": 38 00:23:14.668 } 00:23:14.668 }, 00:23:14.668 "base_bdevs_list": [ 00:23:14.668 { 00:23:14.668 "name": "spare", 00:23:14.668 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:14.668 "is_configured": true, 00:23:14.668 "data_offset": 2048, 00:23:14.668 "data_size": 63488 00:23:14.668 }, 00:23:14.668 { 00:23:14.668 "name": "BaseBdev2", 00:23:14.668 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:14.668 "is_configured": true, 00:23:14.668 
"data_offset": 2048, 00:23:14.668 "data_size": 63488 00:23:14.668 } 00:23:14.668 ] 00:23:14.668 }' 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:14.668 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:14.925 [2024-07-15 09:27:23.752517] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:14.925 [2024-07-15 09:27:23.778560] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:14.925 [2024-07-15 09:27:23.778607] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:14.925 [2024-07-15 09:27:23.778622] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:14.925 [2024-07-15 09:27:23.778631] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.925 09:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.181 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:15.181 "name": "raid_bdev1", 00:23:15.181 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:15.181 "strip_size_kb": 0, 00:23:15.181 "state": "online", 00:23:15.181 "raid_level": "raid1", 00:23:15.181 "superblock": true, 00:23:15.181 "num_base_bdevs": 2, 00:23:15.181 "num_base_bdevs_discovered": 1, 00:23:15.181 "num_base_bdevs_operational": 1, 00:23:15.181 "base_bdevs_list": [ 00:23:15.181 { 00:23:15.181 "name": null, 00:23:15.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.181 "is_configured": false, 00:23:15.181 "data_offset": 2048, 00:23:15.181 "data_size": 63488 00:23:15.181 }, 00:23:15.181 { 
00:23:15.181 "name": "BaseBdev2", 00:23:15.181 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:15.181 "is_configured": true, 00:23:15.181 "data_offset": 2048, 00:23:15.181 "data_size": 63488 00:23:15.181 } 00:23:15.181 ] 00:23:15.181 }' 00:23:15.181 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:15.182 09:27:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:15.746 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:15.746 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:15.746 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:15.746 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:15.746 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:15.746 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.746 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.002 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:16.002 "name": "raid_bdev1", 00:23:16.002 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:16.002 "strip_size_kb": 0, 00:23:16.002 "state": "online", 00:23:16.002 "raid_level": "raid1", 00:23:16.002 "superblock": true, 00:23:16.002 "num_base_bdevs": 2, 00:23:16.002 "num_base_bdevs_discovered": 1, 00:23:16.002 "num_base_bdevs_operational": 1, 00:23:16.002 "base_bdevs_list": [ 00:23:16.002 { 00:23:16.003 "name": null, 00:23:16.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.003 "is_configured": false, 00:23:16.003 "data_offset": 2048, 00:23:16.003 "data_size": 63488 00:23:16.003 }, 00:23:16.003 { 00:23:16.003 "name": "BaseBdev2", 00:23:16.003 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:16.003 "is_configured": true, 00:23:16.003 "data_offset": 2048, 00:23:16.003 "data_size": 63488 00:23:16.003 } 00:23:16.003 ] 00:23:16.003 }' 00:23:16.003 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:16.269 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:16.269 09:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:16.269 09:27:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:16.269 09:27:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:16.527 [2024-07-15 09:27:25.226896] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:16.527 [2024-07-15 09:27:25.232574] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc46ce0 00:23:16.527 [2024-07-15 09:27:25.234092] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:16.527 09:27:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:17.462 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:23:17.462 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.462 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:17.462 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:17.462 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.462 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.462 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.721 "name": "raid_bdev1", 00:23:17.721 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:17.721 "strip_size_kb": 0, 00:23:17.721 "state": "online", 00:23:17.721 "raid_level": "raid1", 00:23:17.721 "superblock": true, 00:23:17.721 "num_base_bdevs": 2, 00:23:17.721 "num_base_bdevs_discovered": 2, 00:23:17.721 "num_base_bdevs_operational": 2, 00:23:17.721 "process": { 00:23:17.721 "type": "rebuild", 00:23:17.721 "target": "spare", 00:23:17.721 "progress": { 00:23:17.721 "blocks": 24576, 00:23:17.721 "percent": 38 00:23:17.721 } 00:23:17.721 }, 00:23:17.721 "base_bdevs_list": [ 00:23:17.721 { 00:23:17.721 "name": "spare", 00:23:17.721 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:17.721 "is_configured": true, 00:23:17.721 "data_offset": 2048, 00:23:17.721 "data_size": 63488 00:23:17.721 }, 00:23:17.721 { 00:23:17.721 "name": "BaseBdev2", 00:23:17.721 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:17.721 "is_configured": true, 00:23:17.721 "data_offset": 2048, 00:23:17.721 "data_size": 63488 00:23:17.721 } 00:23:17.721 ] 00:23:17.721 }' 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:17.721 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=790 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.721 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.980 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.980 "name": "raid_bdev1", 00:23:17.980 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:17.980 "strip_size_kb": 0, 00:23:17.980 "state": "online", 00:23:17.980 "raid_level": "raid1", 00:23:17.980 "superblock": true, 00:23:17.980 "num_base_bdevs": 2, 00:23:17.980 "num_base_bdevs_discovered": 2, 00:23:17.980 "num_base_bdevs_operational": 2, 00:23:17.980 "process": { 00:23:17.980 "type": "rebuild", 00:23:17.980 "target": "spare", 00:23:17.980 "progress": { 00:23:17.980 "blocks": 30720, 00:23:17.980 "percent": 48 00:23:17.980 } 00:23:17.980 }, 00:23:17.980 "base_bdevs_list": [ 00:23:17.980 { 00:23:17.980 "name": "spare", 00:23:17.980 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:17.980 "is_configured": true, 00:23:17.980 "data_offset": 2048, 00:23:17.980 "data_size": 63488 00:23:17.980 }, 00:23:17.980 { 00:23:17.980 "name": "BaseBdev2", 00:23:17.980 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:17.980 "is_configured": true, 00:23:17.980 "data_offset": 2048, 00:23:17.980 "data_size": 63488 00:23:17.980 } 00:23:17.980 ] 00:23:17.980 }' 00:23:17.980 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.980 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:17.980 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.980 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:17.980 09:27:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:19.356 09:27:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:19.356 09:27:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:19.356 09:27:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:19.356 09:27:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:19.356 09:27:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:19.356 09:27:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:19.356 09:27:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.356 09:27:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.356 09:27:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:19.356 "name": "raid_bdev1", 00:23:19.356 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:19.356 "strip_size_kb": 0, 00:23:19.356 "state": 
"online", 00:23:19.356 "raid_level": "raid1", 00:23:19.356 "superblock": true, 00:23:19.356 "num_base_bdevs": 2, 00:23:19.356 "num_base_bdevs_discovered": 2, 00:23:19.356 "num_base_bdevs_operational": 2, 00:23:19.356 "process": { 00:23:19.356 "type": "rebuild", 00:23:19.356 "target": "spare", 00:23:19.356 "progress": { 00:23:19.356 "blocks": 57344, 00:23:19.356 "percent": 90 00:23:19.356 } 00:23:19.356 }, 00:23:19.356 "base_bdevs_list": [ 00:23:19.356 { 00:23:19.356 "name": "spare", 00:23:19.356 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:19.356 "is_configured": true, 00:23:19.356 "data_offset": 2048, 00:23:19.356 "data_size": 63488 00:23:19.356 }, 00:23:19.356 { 00:23:19.356 "name": "BaseBdev2", 00:23:19.356 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:19.356 "is_configured": true, 00:23:19.356 "data_offset": 2048, 00:23:19.356 "data_size": 63488 00:23:19.356 } 00:23:19.356 ] 00:23:19.356 }' 00:23:19.356 09:27:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:19.357 09:27:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:19.357 09:27:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:19.357 09:27:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:19.357 09:27:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:19.615 [2024-07-15 09:27:28.358170] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:19.615 [2024-07-15 09:27:28.358232] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:19.615 [2024-07-15 09:27:28.358310] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.551 "name": "raid_bdev1", 00:23:20.551 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:20.551 "strip_size_kb": 0, 00:23:20.551 "state": "online", 00:23:20.551 "raid_level": "raid1", 00:23:20.551 "superblock": true, 00:23:20.551 "num_base_bdevs": 2, 00:23:20.551 "num_base_bdevs_discovered": 2, 00:23:20.551 "num_base_bdevs_operational": 2, 00:23:20.551 "base_bdevs_list": [ 00:23:20.551 { 00:23:20.551 "name": "spare", 00:23:20.551 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:20.551 "is_configured": true, 00:23:20.551 "data_offset": 2048, 00:23:20.551 "data_size": 63488 
00:23:20.551 }, 00:23:20.551 { 00:23:20.551 "name": "BaseBdev2", 00:23:20.551 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:20.551 "is_configured": true, 00:23:20.551 "data_offset": 2048, 00:23:20.551 "data_size": 63488 00:23:20.551 } 00:23:20.551 ] 00:23:20.551 }' 00:23:20.551 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.811 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.069 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:21.069 "name": "raid_bdev1", 00:23:21.069 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:21.069 "strip_size_kb": 0, 00:23:21.069 "state": "online", 00:23:21.069 "raid_level": "raid1", 00:23:21.069 "superblock": true, 00:23:21.069 "num_base_bdevs": 2, 00:23:21.069 "num_base_bdevs_discovered": 2, 00:23:21.069 "num_base_bdevs_operational": 2, 00:23:21.069 "base_bdevs_list": [ 00:23:21.069 { 00:23:21.069 "name": "spare", 00:23:21.069 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:21.069 "is_configured": true, 00:23:21.069 "data_offset": 2048, 00:23:21.069 "data_size": 63488 00:23:21.069 }, 00:23:21.069 { 00:23:21.069 "name": "BaseBdev2", 00:23:21.069 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:21.069 "is_configured": true, 00:23:21.069 "data_offset": 2048, 00:23:21.069 "data_size": 63488 00:23:21.069 } 00:23:21.069 ] 00:23:21.069 }' 00:23:21.069 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.070 09:27:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.329 09:27:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:21.329 "name": "raid_bdev1", 00:23:21.329 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:21.329 "strip_size_kb": 0, 00:23:21.329 "state": "online", 00:23:21.329 "raid_level": "raid1", 00:23:21.329 "superblock": true, 00:23:21.329 "num_base_bdevs": 2, 00:23:21.329 "num_base_bdevs_discovered": 2, 00:23:21.329 "num_base_bdevs_operational": 2, 00:23:21.329 "base_bdevs_list": [ 00:23:21.329 { 00:23:21.329 "name": "spare", 00:23:21.329 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:21.329 "is_configured": true, 00:23:21.329 "data_offset": 2048, 00:23:21.329 "data_size": 63488 00:23:21.329 }, 00:23:21.329 { 00:23:21.329 "name": "BaseBdev2", 00:23:21.329 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:21.329 "is_configured": true, 00:23:21.329 "data_offset": 2048, 00:23:21.329 "data_size": 63488 00:23:21.329 } 00:23:21.329 ] 00:23:21.329 }' 00:23:21.329 09:27:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:21.329 09:27:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:21.897 09:27:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:22.156 [2024-07-15 09:27:30.997775] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:22.156 [2024-07-15 09:27:30.997803] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:22.156 [2024-07-15 09:27:30.997862] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:22.156 [2024-07-15 09:27:30.997917] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:22.156 [2024-07-15 09:27:30.997936] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc47070 name raid_bdev1, state offline 00:23:22.156 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.156 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:22.415 09:27:31 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:22.415 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:22.674 /dev/nbd0 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:22.674 1+0 records in 00:23:22.674 1+0 records out 00:23:22.674 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227369 s, 18.0 MB/s 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:22.674 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:22.674 09:27:31 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:22.934 /dev/nbd1 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:22.934 1+0 records in 00:23:22.934 1+0 records out 00:23:22.934 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319278 s, 12.8 MB/s 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:22.934 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:23.192 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:23.192 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:23.192 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:23.192 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:23.192 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:23.192 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:23.192 09:27:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:23.469 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:23.469 09:27:32 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:23.469 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:23.469 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:23.469 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:23.469 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:23.469 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:23.469 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:23.469 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:23.469 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:23.774 09:27:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:24.339 [2024-07-15 09:27:33.175016] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:24.339 [2024-07-15 09:27:33.175062] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:24.339 [2024-07-15 09:27:33.175088] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc41670 00:23:24.339 [2024-07-15 09:27:33.175101] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:24.339 [2024-07-15 09:27:33.176781] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:24.339 [2024-07-15 09:27:33.176821] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:24.339 [2024-07-15 09:27:33.176905] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:24.339 [2024-07-15 09:27:33.176944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:24.339 [2024-07-15 09:27:33.177049] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:24.339 spare 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
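The verify_raid_bdev_state call invoked here reduces to filtering the bdev_raid_get_bdevs JSON that appears throughout this log and asserting the printed fields. A minimal sketch of that check, assuming the same socket, JSON field names and jq filter traced above (the real helper lives in test/bdev/bdev_raid.sh):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
# Assert the fields the log prints: an online RAID1 bdev with both base bdevs discovered.
[ "$(jq -r .state <<< "$info")" = online ]
[ "$(jq -r .raid_level <<< "$info")" = raid1 ]
[ "$(jq -r .num_base_bdevs_discovered <<< "$info")" -eq 2 ]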
00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.339 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.339 [2024-07-15 09:27:33.277360] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa95900 00:23:24.339 [2024-07-15 09:27:33.277376] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:24.339 [2024-07-15 09:27:33.277575] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc493d0 00:23:24.339 [2024-07-15 09:27:33.277720] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa95900 00:23:24.339 [2024-07-15 09:27:33.277730] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa95900 00:23:24.339 [2024-07-15 09:27:33.277833] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:24.640 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.640 "name": "raid_bdev1", 00:23:24.640 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:24.640 "strip_size_kb": 0, 00:23:24.640 "state": "online", 00:23:24.640 "raid_level": "raid1", 00:23:24.640 "superblock": true, 00:23:24.640 "num_base_bdevs": 2, 00:23:24.640 "num_base_bdevs_discovered": 2, 00:23:24.640 "num_base_bdevs_operational": 2, 00:23:24.640 "base_bdevs_list": [ 00:23:24.640 { 00:23:24.640 "name": "spare", 00:23:24.640 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:24.640 "is_configured": true, 00:23:24.640 "data_offset": 2048, 00:23:24.640 "data_size": 63488 00:23:24.640 }, 00:23:24.640 { 00:23:24.640 "name": "BaseBdev2", 00:23:24.640 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:24.640 "is_configured": true, 00:23:24.640 "data_offset": 2048, 00:23:24.640 "data_size": 63488 00:23:24.640 } 00:23:24.640 ] 00:23:24.640 }' 00:23:24.640 09:27:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.640 09:27:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:25.206 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:25.206 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:25.206 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:23:25.206 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:25.206 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:25.206 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.206 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.463 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:25.463 "name": "raid_bdev1", 00:23:25.463 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:25.463 "strip_size_kb": 0, 00:23:25.463 "state": "online", 00:23:25.463 "raid_level": "raid1", 00:23:25.463 "superblock": true, 00:23:25.463 "num_base_bdevs": 2, 00:23:25.463 "num_base_bdevs_discovered": 2, 00:23:25.463 "num_base_bdevs_operational": 2, 00:23:25.463 "base_bdevs_list": [ 00:23:25.463 { 00:23:25.463 "name": "spare", 00:23:25.463 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:25.463 "is_configured": true, 00:23:25.463 "data_offset": 2048, 00:23:25.463 "data_size": 63488 00:23:25.463 }, 00:23:25.463 { 00:23:25.463 "name": "BaseBdev2", 00:23:25.463 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:25.463 "is_configured": true, 00:23:25.463 "data_offset": 2048, 00:23:25.463 "data_size": 63488 00:23:25.463 } 00:23:25.463 ] 00:23:25.463 }' 00:23:25.463 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:25.463 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:25.463 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:25.463 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:25.463 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.463 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:25.721 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:25.721 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:25.979 [2024-07-15 09:27:34.735256] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.979 09:27:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.238 09:27:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.238 "name": "raid_bdev1", 00:23:26.238 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:26.238 "strip_size_kb": 0, 00:23:26.238 "state": "online", 00:23:26.238 "raid_level": "raid1", 00:23:26.238 "superblock": true, 00:23:26.238 "num_base_bdevs": 2, 00:23:26.238 "num_base_bdevs_discovered": 1, 00:23:26.238 "num_base_bdevs_operational": 1, 00:23:26.238 "base_bdevs_list": [ 00:23:26.238 { 00:23:26.238 "name": null, 00:23:26.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:26.238 "is_configured": false, 00:23:26.238 "data_offset": 2048, 00:23:26.238 "data_size": 63488 00:23:26.238 }, 00:23:26.238 { 00:23:26.238 "name": "BaseBdev2", 00:23:26.238 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:26.238 "is_configured": true, 00:23:26.238 "data_offset": 2048, 00:23:26.238 "data_size": 63488 00:23:26.238 } 00:23:26.238 ] 00:23:26.238 }' 00:23:26.238 09:27:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.238 09:27:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:26.804 09:27:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:27.063 [2024-07-15 09:27:35.842204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:27.063 [2024-07-15 09:27:35.842346] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:27.063 [2024-07-15 09:27:35.842362] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
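After the spare is re-added, the trace that follows polls the rebuild the same way each time: re-read the raid bdev JSON and inspect .process.type / .process.target until the process entry disappears. A minimal sketch of that poll loop, assuming the jq filters and the 1-second sleep seen in the trace (the timeout handling of bdev_raid.sh is omitted):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
while :; do
  info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  # While the rebuild runs, .process reports type "rebuild", target "spare" and a progress block.
  [ "$(jq -r '.process.type // "none"' <<< "$info")" = rebuild ] || break
  sleep 1
done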
00:23:27.063 [2024-07-15 09:27:35.842390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:27.063 [2024-07-15 09:27:35.847199] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc46d00 00:23:27.063 [2024-07-15 09:27:35.849515] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:27.063 09:27:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:27.996 09:27:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:27.996 09:27:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:27.996 09:27:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:27.996 09:27:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:27.996 09:27:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:27.996 09:27:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.996 09:27:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.254 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:28.254 "name": "raid_bdev1", 00:23:28.254 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:28.254 "strip_size_kb": 0, 00:23:28.254 "state": "online", 00:23:28.254 "raid_level": "raid1", 00:23:28.254 "superblock": true, 00:23:28.254 "num_base_bdevs": 2, 00:23:28.254 "num_base_bdevs_discovered": 2, 00:23:28.254 "num_base_bdevs_operational": 2, 00:23:28.254 "process": { 00:23:28.254 "type": "rebuild", 00:23:28.254 "target": "spare", 00:23:28.254 "progress": { 00:23:28.254 "blocks": 24576, 00:23:28.254 "percent": 38 00:23:28.254 } 00:23:28.254 }, 00:23:28.254 "base_bdevs_list": [ 00:23:28.254 { 00:23:28.254 "name": "spare", 00:23:28.254 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:28.254 "is_configured": true, 00:23:28.254 "data_offset": 2048, 00:23:28.254 "data_size": 63488 00:23:28.254 }, 00:23:28.254 { 00:23:28.254 "name": "BaseBdev2", 00:23:28.254 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:28.254 "is_configured": true, 00:23:28.254 "data_offset": 2048, 00:23:28.254 "data_size": 63488 00:23:28.254 } 00:23:28.254 ] 00:23:28.254 }' 00:23:28.254 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:28.254 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:28.254 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:28.511 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:28.511 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:28.511 [2024-07-15 09:27:37.439845] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:28.511 [2024-07-15 09:27:37.462199] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:28.511 [2024-07-15 09:27:37.462245] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:23:28.511 [2024-07-15 09:27:37.462260] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:28.511 [2024-07-15 09:27:37.462269] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.768 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.025 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:29.025 "name": "raid_bdev1", 00:23:29.025 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:29.025 "strip_size_kb": 0, 00:23:29.025 "state": "online", 00:23:29.025 "raid_level": "raid1", 00:23:29.025 "superblock": true, 00:23:29.025 "num_base_bdevs": 2, 00:23:29.025 "num_base_bdevs_discovered": 1, 00:23:29.025 "num_base_bdevs_operational": 1, 00:23:29.025 "base_bdevs_list": [ 00:23:29.025 { 00:23:29.025 "name": null, 00:23:29.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:29.025 "is_configured": false, 00:23:29.025 "data_offset": 2048, 00:23:29.025 "data_size": 63488 00:23:29.025 }, 00:23:29.025 { 00:23:29.025 "name": "BaseBdev2", 00:23:29.025 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:29.025 "is_configured": true, 00:23:29.025 "data_offset": 2048, 00:23:29.025 "data_size": 63488 00:23:29.025 } 00:23:29.025 ] 00:23:29.025 }' 00:23:29.025 09:27:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:29.025 09:27:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:29.590 09:27:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:29.590 [2024-07-15 09:27:38.469796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:29.590 [2024-07-15 09:27:38.469841] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:29.590 [2024-07-15 09:27:38.469863] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa8dc10 00:23:29.590 [2024-07-15 09:27:38.469877] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:23:29.590 [2024-07-15 09:27:38.470241] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.590 [2024-07-15 09:27:38.470258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:29.590 [2024-07-15 09:27:38.470336] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:29.590 [2024-07-15 09:27:38.470348] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:29.590 [2024-07-15 09:27:38.470359] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:29.590 [2024-07-15 09:27:38.470377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:29.590 [2024-07-15 09:27:38.475189] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc46ce0 00:23:29.590 spare 00:23:29.590 [2024-07-15 09:27:38.476645] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:29.590 09:27:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:30.963 "name": "raid_bdev1", 00:23:30.963 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:30.963 "strip_size_kb": 0, 00:23:30.963 "state": "online", 00:23:30.963 "raid_level": "raid1", 00:23:30.963 "superblock": true, 00:23:30.963 "num_base_bdevs": 2, 00:23:30.963 "num_base_bdevs_discovered": 2, 00:23:30.963 "num_base_bdevs_operational": 2, 00:23:30.963 "process": { 00:23:30.963 "type": "rebuild", 00:23:30.963 "target": "spare", 00:23:30.963 "progress": { 00:23:30.963 "blocks": 22528, 00:23:30.963 "percent": 35 00:23:30.963 } 00:23:30.963 }, 00:23:30.963 "base_bdevs_list": [ 00:23:30.963 { 00:23:30.963 "name": "spare", 00:23:30.963 "uuid": "0eaa439c-c285-55be-9407-7b2b941c7313", 00:23:30.963 "is_configured": true, 00:23:30.963 "data_offset": 2048, 00:23:30.963 "data_size": 63488 00:23:30.963 }, 00:23:30.963 { 00:23:30.963 "name": "BaseBdev2", 00:23:30.963 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:30.963 "is_configured": true, 00:23:30.963 "data_offset": 2048, 00:23:30.963 "data_size": 63488 00:23:30.963 } 00:23:30.963 ] 00:23:30.963 }' 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:30.963 09:27:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:31.220 [2024-07-15 09:27:39.995675] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:31.221 [2024-07-15 09:27:40.089199] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:31.221 [2024-07-15 09:27:40.089250] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:31.221 [2024-07-15 09:27:40.089265] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:31.221 [2024-07-15 09:27:40.089274] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.221 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.478 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:31.478 "name": "raid_bdev1", 00:23:31.478 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:31.478 "strip_size_kb": 0, 00:23:31.478 "state": "online", 00:23:31.478 "raid_level": "raid1", 00:23:31.478 "superblock": true, 00:23:31.478 "num_base_bdevs": 2, 00:23:31.478 "num_base_bdevs_discovered": 1, 00:23:31.478 "num_base_bdevs_operational": 1, 00:23:31.478 "base_bdevs_list": [ 00:23:31.478 { 00:23:31.478 "name": null, 00:23:31.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:31.478 "is_configured": false, 00:23:31.478 "data_offset": 2048, 00:23:31.478 "data_size": 63488 00:23:31.478 }, 00:23:31.478 { 00:23:31.478 "name": "BaseBdev2", 00:23:31.478 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:31.478 "is_configured": true, 00:23:31.478 "data_offset": 2048, 00:23:31.478 "data_size": 63488 00:23:31.478 } 00:23:31.478 ] 00:23:31.478 }' 00:23:31.478 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:31.478 09:27:40 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:23:32.043 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:32.043 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:32.043 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:32.043 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:32.043 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:32.043 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.043 09:27:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.300 09:27:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:32.300 "name": "raid_bdev1", 00:23:32.300 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:32.300 "strip_size_kb": 0, 00:23:32.300 "state": "online", 00:23:32.300 "raid_level": "raid1", 00:23:32.300 "superblock": true, 00:23:32.300 "num_base_bdevs": 2, 00:23:32.300 "num_base_bdevs_discovered": 1, 00:23:32.300 "num_base_bdevs_operational": 1, 00:23:32.300 "base_bdevs_list": [ 00:23:32.300 { 00:23:32.300 "name": null, 00:23:32.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.301 "is_configured": false, 00:23:32.301 "data_offset": 2048, 00:23:32.301 "data_size": 63488 00:23:32.301 }, 00:23:32.301 { 00:23:32.301 "name": "BaseBdev2", 00:23:32.301 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:32.301 "is_configured": true, 00:23:32.301 "data_offset": 2048, 00:23:32.301 "data_size": 63488 00:23:32.301 } 00:23:32.301 ] 00:23:32.301 }' 00:23:32.301 09:27:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:32.301 09:27:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:32.301 09:27:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:32.558 09:27:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:32.558 09:27:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:32.815 09:27:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:33.072 [2024-07-15 09:27:41.769980] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:33.072 [2024-07-15 09:27:41.770022] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:33.072 [2024-07-15 09:27:41.770042] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc45260 00:23:33.072 [2024-07-15 09:27:41.770054] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:33.072 [2024-07-15 09:27:41.770385] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:33.072 [2024-07-15 09:27:41.770403] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:33.072 [2024-07-15 09:27:41.770464] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:33.072 [2024-07-15 09:27:41.770476] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:33.072 [2024-07-15 09:27:41.770487] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:33.072 BaseBdev1 00:23:33.072 09:27:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.006 09:27:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.264 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.264 "name": "raid_bdev1", 00:23:34.264 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:34.264 "strip_size_kb": 0, 00:23:34.264 "state": "online", 00:23:34.264 "raid_level": "raid1", 00:23:34.264 "superblock": true, 00:23:34.264 "num_base_bdevs": 2, 00:23:34.264 "num_base_bdevs_discovered": 1, 00:23:34.264 "num_base_bdevs_operational": 1, 00:23:34.264 "base_bdevs_list": [ 00:23:34.264 { 00:23:34.264 "name": null, 00:23:34.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.264 "is_configured": false, 00:23:34.264 "data_offset": 2048, 00:23:34.264 "data_size": 63488 00:23:34.264 }, 00:23:34.264 { 00:23:34.264 "name": "BaseBdev2", 00:23:34.264 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:34.264 "is_configured": true, 00:23:34.264 "data_offset": 2048, 00:23:34.264 "data_size": 63488 00:23:34.264 } 00:23:34.264 ] 00:23:34.264 }' 00:23:34.264 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.264 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:34.829 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:34.829 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:34.829 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:34.829 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:23:34.829 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:34.829 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.829 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.088 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:35.088 "name": "raid_bdev1", 00:23:35.088 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:35.088 "strip_size_kb": 0, 00:23:35.088 "state": "online", 00:23:35.088 "raid_level": "raid1", 00:23:35.088 "superblock": true, 00:23:35.088 "num_base_bdevs": 2, 00:23:35.088 "num_base_bdevs_discovered": 1, 00:23:35.088 "num_base_bdevs_operational": 1, 00:23:35.088 "base_bdevs_list": [ 00:23:35.088 { 00:23:35.088 "name": null, 00:23:35.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.088 "is_configured": false, 00:23:35.088 "data_offset": 2048, 00:23:35.088 "data_size": 63488 00:23:35.088 }, 00:23:35.088 { 00:23:35.088 "name": "BaseBdev2", 00:23:35.088 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:35.088 "is_configured": true, 00:23:35.088 "data_offset": 2048, 00:23:35.088 "data_size": 63488 00:23:35.088 } 00:23:35.088 ] 00:23:35.088 }' 00:23:35.088 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.088 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:35.088 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.088 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:35.088 09:27:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:35.088 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:35.089 09:27:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:35.347 [2024-07-15 09:27:44.216482] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:35.347 [2024-07-15 09:27:44.216598] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:35.347 [2024-07-15 09:27:44.216619] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:35.347 request: 00:23:35.347 { 00:23:35.347 "base_bdev": "BaseBdev1", 00:23:35.347 "raid_bdev": "raid_bdev1", 00:23:35.347 "method": "bdev_raid_add_base_bdev", 00:23:35.347 "req_id": 1 00:23:35.347 } 00:23:35.347 Got JSON-RPC error response 00:23:35.347 response: 00:23:35.347 { 00:23:35.347 "code": -22, 00:23:35.347 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:35.347 } 00:23:35.347 09:27:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:23:35.347 09:27:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:35.347 09:27:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:35.347 09:27:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:35.347 09:27:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.722 "name": "raid_bdev1", 00:23:36.722 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:36.722 "strip_size_kb": 0, 00:23:36.722 "state": "online", 00:23:36.722 "raid_level": "raid1", 00:23:36.722 "superblock": true, 00:23:36.722 "num_base_bdevs": 2, 00:23:36.722 "num_base_bdevs_discovered": 1, 00:23:36.722 "num_base_bdevs_operational": 1, 00:23:36.722 
"base_bdevs_list": [ 00:23:36.722 { 00:23:36.722 "name": null, 00:23:36.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.722 "is_configured": false, 00:23:36.722 "data_offset": 2048, 00:23:36.722 "data_size": 63488 00:23:36.722 }, 00:23:36.722 { 00:23:36.722 "name": "BaseBdev2", 00:23:36.722 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:36.722 "is_configured": true, 00:23:36.722 "data_offset": 2048, 00:23:36.722 "data_size": 63488 00:23:36.722 } 00:23:36.722 ] 00:23:36.722 }' 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.722 09:27:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:37.290 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:37.290 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:37.290 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:37.290 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:37.290 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:37.290 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.290 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:37.549 "name": "raid_bdev1", 00:23:37.549 "uuid": "26ee6a07-54d1-412b-abde-9e052cad31fc", 00:23:37.549 "strip_size_kb": 0, 00:23:37.549 "state": "online", 00:23:37.549 "raid_level": "raid1", 00:23:37.549 "superblock": true, 00:23:37.549 "num_base_bdevs": 2, 00:23:37.549 "num_base_bdevs_discovered": 1, 00:23:37.549 "num_base_bdevs_operational": 1, 00:23:37.549 "base_bdevs_list": [ 00:23:37.549 { 00:23:37.549 "name": null, 00:23:37.549 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.549 "is_configured": false, 00:23:37.549 "data_offset": 2048, 00:23:37.549 "data_size": 63488 00:23:37.549 }, 00:23:37.549 { 00:23:37.549 "name": "BaseBdev2", 00:23:37.549 "uuid": "801078ca-f63f-557e-9821-a4325248d4ac", 00:23:37.549 "is_configured": true, 00:23:37.549 "data_offset": 2048, 00:23:37.549 "data_size": 63488 00:23:37.549 } 00:23:37.549 ] 00:23:37.549 }' 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 194311 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 194311 ']' 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 194311 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:37.549 09:27:46 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 194311 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 194311' 00:23:37.549 killing process with pid 194311 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 194311 00:23:37.549 Received shutdown signal, test time was about 60.000000 seconds 00:23:37.549 00:23:37.549 Latency(us) 00:23:37.549 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:37.549 =================================================================================================================== 00:23:37.549 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:37.549 [2024-07-15 09:27:46.450144] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:37.549 [2024-07-15 09:27:46.450231] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:37.549 [2024-07-15 09:27:46.450271] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:37.549 [2024-07-15 09:27:46.450283] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa95900 name raid_bdev1, state offline 00:23:37.549 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 194311 00:23:37.549 [2024-07-15 09:27:46.476394] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:37.808 09:27:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:23:37.808 00:23:37.808 real 0m36.760s 00:23:37.808 user 0m52.545s 00:23:37.808 sys 0m7.269s 00:23:37.808 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:37.808 09:27:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:37.808 ************************************ 00:23:37.808 END TEST raid_rebuild_test_sb 00:23:37.808 ************************************ 00:23:37.808 09:27:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:37.808 09:27:46 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:23:37.808 09:27:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:37.808 09:27:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:37.808 09:27:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:38.067 ************************************ 00:23:38.067 START TEST raid_rebuild_test_io 00:23:38.067 ************************************ 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:38.067 
09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=199502 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 199502 /var/tmp/spdk-raid.sock 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 199502 ']' 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:38.067 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:38.067 09:27:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:38.067 [2024-07-15 09:27:46.833499] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:23:38.067 [2024-07-15 09:27:46.833569] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199502 ] 00:23:38.067 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:38.067 Zero copy mechanism will not be used. 00:23:38.067 [2024-07-15 09:27:46.961508] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:38.325 [2024-07-15 09:27:47.063129] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:38.325 [2024-07-15 09:27:47.128778] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:38.325 [2024-07-15 09:27:47.128822] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:38.892 09:27:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:38.892 09:27:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:23:38.892 09:27:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:38.892 09:27:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:38.892 BaseBdev1_malloc 00:23:38.892 09:27:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:39.151 [2024-07-15 09:27:47.978341] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:39.151 [2024-07-15 09:27:47.978390] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:39.151 [2024-07-15 09:27:47.978413] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa91d40 00:23:39.151 [2024-07-15 09:27:47.978425] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:39.151 [2024-07-15 09:27:47.980141] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:39.151 [2024-07-15 09:27:47.980170] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:39.151 BaseBdev1 00:23:39.151 09:27:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:39.151 09:27:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:39.410 BaseBdev2_malloc 00:23:39.410 09:27:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:39.668 [2024-07-15 09:27:48.380248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:39.668 [2024-07-15 09:27:48.380293] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:39.668 [2024-07-15 09:27:48.380315] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa92860 00:23:39.668 [2024-07-15 09:27:48.380328] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:39.668 [2024-07-15 09:27:48.381871] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:39.668 [2024-07-15 09:27:48.381898] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:39.668 BaseBdev2 00:23:39.668 09:27:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:39.668 spare_malloc 00:23:39.668 09:27:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:39.927 spare_delay 00:23:39.927 09:27:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:40.187 [2024-07-15 09:27:49.034565] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:40.187 [2024-07-15 09:27:49.034613] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:40.187 [2024-07-15 09:27:49.034634] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc40ec0 00:23:40.187 [2024-07-15 09:27:49.034646] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:40.187 [2024-07-15 09:27:49.036269] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:40.187 [2024-07-15 09:27:49.036299] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:40.187 spare 00:23:40.187 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:40.486 [2024-07-15 09:27:49.199023] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:40.486 [2024-07-15 09:27:49.200366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:40.486 [2024-07-15 09:27:49.200444] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc42070 00:23:40.486 [2024-07-15 09:27:49.200456] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:40.486 [2024-07-15 09:27:49.200668] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc3b490 00:23:40.486 [2024-07-15 09:27:49.200810] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc42070 00:23:40.486 [2024-07-15 09:27:49.200820] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc42070 00:23:40.486 [2024-07-15 09:27:49.200943] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.486 "name": "raid_bdev1", 00:23:40.486 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:40.486 "strip_size_kb": 0, 00:23:40.486 "state": "online", 00:23:40.486 "raid_level": "raid1", 00:23:40.486 "superblock": false, 00:23:40.486 "num_base_bdevs": 2, 00:23:40.486 "num_base_bdevs_discovered": 2, 00:23:40.486 "num_base_bdevs_operational": 2, 00:23:40.486 "base_bdevs_list": [ 00:23:40.486 { 00:23:40.486 "name": "BaseBdev1", 00:23:40.486 "uuid": "303bff80-e54b-50be-9a11-1bbd518262d6", 00:23:40.486 "is_configured": true, 00:23:40.486 "data_offset": 0, 00:23:40.486 "data_size": 65536 00:23:40.486 }, 00:23:40.486 { 00:23:40.486 "name": "BaseBdev2", 00:23:40.486 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:40.486 "is_configured": true, 00:23:40.486 "data_offset": 0, 00:23:40.486 "data_size": 65536 00:23:40.486 } 00:23:40.486 ] 00:23:40.486 }' 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.486 09:27:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:41.079 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:41.079 09:27:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:41.338 [2024-07-15 09:27:50.161856] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:41.338 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:41.338 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.338 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:41.596 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:41.596 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:41.596 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:41.596 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:41.596 [2024-07-15 09:27:50.532640] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 
0xc3cbd0 00:23:41.596 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:41.596 Zero copy mechanism will not be used. 00:23:41.596 Running I/O for 60 seconds... 00:23:41.855 [2024-07-15 09:27:50.649204] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:41.855 [2024-07-15 09:27:50.657347] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc3cbd0 00:23:41.855 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:41.855 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:41.855 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:41.855 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:41.855 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:41.856 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:41.856 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:41.856 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:41.856 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:41.856 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:41.856 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.856 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:42.114 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:42.114 "name": "raid_bdev1", 00:23:42.114 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:42.115 "strip_size_kb": 0, 00:23:42.115 "state": "online", 00:23:42.115 "raid_level": "raid1", 00:23:42.115 "superblock": false, 00:23:42.115 "num_base_bdevs": 2, 00:23:42.115 "num_base_bdevs_discovered": 1, 00:23:42.115 "num_base_bdevs_operational": 1, 00:23:42.115 "base_bdevs_list": [ 00:23:42.115 { 00:23:42.115 "name": null, 00:23:42.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:42.115 "is_configured": false, 00:23:42.115 "data_offset": 0, 00:23:42.115 "data_size": 65536 00:23:42.115 }, 00:23:42.115 { 00:23:42.115 "name": "BaseBdev2", 00:23:42.115 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:42.115 "is_configured": true, 00:23:42.115 "data_offset": 0, 00:23:42.115 "data_size": 65536 00:23:42.115 } 00:23:42.115 ] 00:23:42.115 }' 00:23:42.115 09:27:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:42.115 09:27:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:42.681 09:27:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:42.940 [2024-07-15 09:27:51.683845] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:42.940 09:27:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:42.940 [2024-07-15 09:27:51.758810] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbc48b0 
00:23:42.940 [2024-07-15 09:27:51.761170] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:42.940 [2024-07-15 09:27:51.863686] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:42.940 [2024-07-15 09:27:51.864004] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:43.198 [2024-07-15 09:27:52.000127] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:43.198 [2024-07-15 09:27:52.000379] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:43.456 [2024-07-15 09:27:52.405175] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:43.456 [2024-07-15 09:27:52.405586] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:43.714 [2024-07-15 09:27:52.628744] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:43.973 09:27:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:43.973 09:27:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:43.973 09:27:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:43.973 09:27:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:43.973 09:27:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:43.973 09:27:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.973 09:27:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:44.232 [2024-07-15 09:27:52.977221] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:44.232 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:44.232 "name": "raid_bdev1", 00:23:44.232 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:44.232 "strip_size_kb": 0, 00:23:44.232 "state": "online", 00:23:44.232 "raid_level": "raid1", 00:23:44.232 "superblock": false, 00:23:44.232 "num_base_bdevs": 2, 00:23:44.232 "num_base_bdevs_discovered": 2, 00:23:44.232 "num_base_bdevs_operational": 2, 00:23:44.232 "process": { 00:23:44.232 "type": "rebuild", 00:23:44.232 "target": "spare", 00:23:44.232 "progress": { 00:23:44.232 "blocks": 12288, 00:23:44.232 "percent": 18 00:23:44.232 } 00:23:44.232 }, 00:23:44.232 "base_bdevs_list": [ 00:23:44.232 { 00:23:44.232 "name": "spare", 00:23:44.232 "uuid": "4ef6b346-091c-5f28-bc2b-4632b07b5ef5", 00:23:44.232 "is_configured": true, 00:23:44.232 "data_offset": 0, 00:23:44.232 "data_size": 65536 00:23:44.232 }, 00:23:44.232 { 00:23:44.232 "name": "BaseBdev2", 00:23:44.232 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:44.232 "is_configured": true, 00:23:44.232 "data_offset": 0, 00:23:44.232 "data_size": 65536 00:23:44.232 } 00:23:44.232 ] 00:23:44.232 }' 00:23:44.232 09:27:53 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:44.232 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:44.232 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:44.232 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:44.232 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:44.490 [2024-07-15 09:27:53.205698] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:44.490 [2024-07-15 09:27:53.377428] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:44.490 [2024-07-15 09:27:53.432054] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:44.749 [2024-07-15 09:27:53.539951] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:44.749 [2024-07-15 09:27:53.541548] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:44.749 [2024-07-15 09:27:53.541574] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:44.749 [2024-07-15 09:27:53.541584] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:44.749 [2024-07-15 09:27:53.564226] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc3cbd0 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.749 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.008 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:45.008 "name": "raid_bdev1", 00:23:45.008 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:45.008 "strip_size_kb": 0, 00:23:45.008 "state": "online", 00:23:45.008 "raid_level": "raid1", 00:23:45.008 "superblock": false, 00:23:45.008 "num_base_bdevs": 2, 00:23:45.008 "num_base_bdevs_discovered": 1, 00:23:45.008 
"num_base_bdevs_operational": 1, 00:23:45.008 "base_bdevs_list": [ 00:23:45.008 { 00:23:45.008 "name": null, 00:23:45.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:45.008 "is_configured": false, 00:23:45.008 "data_offset": 0, 00:23:45.008 "data_size": 65536 00:23:45.008 }, 00:23:45.008 { 00:23:45.008 "name": "BaseBdev2", 00:23:45.008 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:45.008 "is_configured": true, 00:23:45.008 "data_offset": 0, 00:23:45.008 "data_size": 65536 00:23:45.008 } 00:23:45.008 ] 00:23:45.008 }' 00:23:45.008 09:27:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:45.008 09:27:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:45.943 09:27:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:45.943 09:27:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:45.943 09:27:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:45.943 09:27:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:45.943 09:27:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:45.943 09:27:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.943 09:27:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.202 09:27:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:46.202 "name": "raid_bdev1", 00:23:46.202 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:46.202 "strip_size_kb": 0, 00:23:46.202 "state": "online", 00:23:46.202 "raid_level": "raid1", 00:23:46.202 "superblock": false, 00:23:46.202 "num_base_bdevs": 2, 00:23:46.202 "num_base_bdevs_discovered": 1, 00:23:46.203 "num_base_bdevs_operational": 1, 00:23:46.203 "base_bdevs_list": [ 00:23:46.203 { 00:23:46.203 "name": null, 00:23:46.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:46.203 "is_configured": false, 00:23:46.203 "data_offset": 0, 00:23:46.203 "data_size": 65536 00:23:46.203 }, 00:23:46.203 { 00:23:46.203 "name": "BaseBdev2", 00:23:46.203 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:46.203 "is_configured": true, 00:23:46.203 "data_offset": 0, 00:23:46.203 "data_size": 65536 00:23:46.203 } 00:23:46.203 ] 00:23:46.203 }' 00:23:46.203 09:27:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:46.203 09:27:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:46.203 09:27:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:46.203 09:27:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:46.203 09:27:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:46.461 [2024-07-15 09:27:55.318430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:46.461 09:27:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:46.461 [2024-07-15 09:27:55.387163] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0xc42450 00:23:46.461 [2024-07-15 09:27:55.388668] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:46.719 [2024-07-15 09:27:55.507543] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:46.720 [2024-07-15 09:27:55.507993] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:46.977 [2024-07-15 09:27:55.735592] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:46.977 [2024-07-15 09:27:55.735777] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:47.236 [2024-07-15 09:27:56.095900] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:47.495 [2024-07-15 09:27:56.224841] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:47.495 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:47.495 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:47.495 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:47.495 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:47.495 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:47.495 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.495 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.754 [2024-07-15 09:27:56.564636] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:47.754 [2024-07-15 09:27:56.565040] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:47.754 "name": "raid_bdev1", 00:23:47.754 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:47.754 "strip_size_kb": 0, 00:23:47.754 "state": "online", 00:23:47.754 "raid_level": "raid1", 00:23:47.754 "superblock": false, 00:23:47.754 "num_base_bdevs": 2, 00:23:47.754 "num_base_bdevs_discovered": 2, 00:23:47.754 "num_base_bdevs_operational": 2, 00:23:47.754 "process": { 00:23:47.754 "type": "rebuild", 00:23:47.754 "target": "spare", 00:23:47.754 "progress": { 00:23:47.754 "blocks": 14336, 00:23:47.754 "percent": 21 00:23:47.754 } 00:23:47.754 }, 00:23:47.754 "base_bdevs_list": [ 00:23:47.754 { 00:23:47.754 "name": "spare", 00:23:47.754 "uuid": "4ef6b346-091c-5f28-bc2b-4632b07b5ef5", 00:23:47.754 "is_configured": true, 00:23:47.754 "data_offset": 0, 00:23:47.754 "data_size": 65536 00:23:47.754 }, 00:23:47.754 { 00:23:47.754 "name": "BaseBdev2", 00:23:47.754 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:47.754 "is_configured": true, 00:23:47.754 "data_offset": 0, 00:23:47.754 "data_size": 65536 00:23:47.754 } 00:23:47.754 ] 00:23:47.754 }' 00:23:47.754 09:27:56 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=820 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.754 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.013 [2024-07-15 09:27:56.802247] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:48.013 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:48.013 "name": "raid_bdev1", 00:23:48.013 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:48.013 "strip_size_kb": 0, 00:23:48.013 "state": "online", 00:23:48.013 "raid_level": "raid1", 00:23:48.013 "superblock": false, 00:23:48.013 "num_base_bdevs": 2, 00:23:48.013 "num_base_bdevs_discovered": 2, 00:23:48.013 "num_base_bdevs_operational": 2, 00:23:48.013 "process": { 00:23:48.013 "type": "rebuild", 00:23:48.013 "target": "spare", 00:23:48.013 "progress": { 00:23:48.013 "blocks": 16384, 00:23:48.013 "percent": 25 00:23:48.013 } 00:23:48.013 }, 00:23:48.013 "base_bdevs_list": [ 00:23:48.013 { 00:23:48.013 "name": "spare", 00:23:48.013 "uuid": "4ef6b346-091c-5f28-bc2b-4632b07b5ef5", 00:23:48.013 "is_configured": true, 00:23:48.013 "data_offset": 0, 00:23:48.013 "data_size": 65536 00:23:48.013 }, 00:23:48.013 { 00:23:48.013 "name": "BaseBdev2", 00:23:48.013 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:48.013 "is_configured": true, 00:23:48.013 "data_offset": 0, 00:23:48.013 "data_size": 65536 00:23:48.013 } 00:23:48.013 ] 00:23:48.013 }' 00:23:48.013 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:48.272 09:27:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:48.272 09:27:56 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:48.272 09:27:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:48.272 09:27:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:48.272 [2024-07-15 09:27:57.067254] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:48.841 [2024-07-15 09:27:57.658478] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:48.841 [2024-07-15 09:27:57.658661] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:49.100 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:49.100 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:49.100 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:49.100 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:49.100 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:49.100 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:49.100 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.100 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.359 [2024-07-15 09:27:58.080918] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:49.359 [2024-07-15 09:27:58.081168] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:49.359 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:49.359 "name": "raid_bdev1", 00:23:49.359 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:49.359 "strip_size_kb": 0, 00:23:49.359 "state": "online", 00:23:49.359 "raid_level": "raid1", 00:23:49.360 "superblock": false, 00:23:49.360 "num_base_bdevs": 2, 00:23:49.360 "num_base_bdevs_discovered": 2, 00:23:49.360 "num_base_bdevs_operational": 2, 00:23:49.360 "process": { 00:23:49.360 "type": "rebuild", 00:23:49.360 "target": "spare", 00:23:49.360 "progress": { 00:23:49.360 "blocks": 34816, 00:23:49.360 "percent": 53 00:23:49.360 } 00:23:49.360 }, 00:23:49.360 "base_bdevs_list": [ 00:23:49.360 { 00:23:49.360 "name": "spare", 00:23:49.360 "uuid": "4ef6b346-091c-5f28-bc2b-4632b07b5ef5", 00:23:49.360 "is_configured": true, 00:23:49.360 "data_offset": 0, 00:23:49.360 "data_size": 65536 00:23:49.360 }, 00:23:49.360 { 00:23:49.360 "name": "BaseBdev2", 00:23:49.360 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:49.360 "is_configured": true, 00:23:49.360 "data_offset": 0, 00:23:49.360 "data_size": 65536 00:23:49.360 } 00:23:49.360 ] 00:23:49.360 }' 00:23:49.360 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:49.619 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:49.619 09:27:58 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:49.619 [2024-07-15 09:27:58.404768] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:23:49.619 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:49.619 09:27:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:49.619 [2024-07-15 09:27:58.514776] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:50.186 [2024-07-15 09:27:59.070890] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.755 [2024-07-15 09:27:59.509838] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.755 "name": "raid_bdev1", 00:23:50.755 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:50.755 "strip_size_kb": 0, 00:23:50.755 "state": "online", 00:23:50.755 "raid_level": "raid1", 00:23:50.755 "superblock": false, 00:23:50.755 "num_base_bdevs": 2, 00:23:50.755 "num_base_bdevs_discovered": 2, 00:23:50.755 "num_base_bdevs_operational": 2, 00:23:50.755 "process": { 00:23:50.755 "type": "rebuild", 00:23:50.755 "target": "spare", 00:23:50.755 "progress": { 00:23:50.755 "blocks": 59392, 00:23:50.755 "percent": 90 00:23:50.755 } 00:23:50.755 }, 00:23:50.755 "base_bdevs_list": [ 00:23:50.755 { 00:23:50.755 "name": "spare", 00:23:50.755 "uuid": "4ef6b346-091c-5f28-bc2b-4632b07b5ef5", 00:23:50.755 "is_configured": true, 00:23:50.755 "data_offset": 0, 00:23:50.755 "data_size": 65536 00:23:50.755 }, 00:23:50.755 { 00:23:50.755 "name": "BaseBdev2", 00:23:50.755 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:50.755 "is_configured": true, 00:23:50.755 "data_offset": 0, 00:23:50.755 "data_size": 65536 00:23:50.755 } 00:23:50.755 ] 00:23:50.755 }' 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ 
spare == \s\p\a\r\e ]] 00:23:50.755 09:27:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:51.035 [2024-07-15 09:27:59.948016] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:51.295 [2024-07-15 09:28:00.048243] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:51.295 [2024-07-15 09:28:00.049755] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:51.862 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:51.862 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:51.862 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.862 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:51.862 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:51.862 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.862 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.862 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.120 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.120 "name": "raid_bdev1", 00:23:52.120 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:52.120 "strip_size_kb": 0, 00:23:52.120 "state": "online", 00:23:52.120 "raid_level": "raid1", 00:23:52.120 "superblock": false, 00:23:52.120 "num_base_bdevs": 2, 00:23:52.120 "num_base_bdevs_discovered": 2, 00:23:52.120 "num_base_bdevs_operational": 2, 00:23:52.120 "base_bdevs_list": [ 00:23:52.120 { 00:23:52.120 "name": "spare", 00:23:52.120 "uuid": "4ef6b346-091c-5f28-bc2b-4632b07b5ef5", 00:23:52.120 "is_configured": true, 00:23:52.120 "data_offset": 0, 00:23:52.120 "data_size": 65536 00:23:52.120 }, 00:23:52.120 { 00:23:52.120 "name": "BaseBdev2", 00:23:52.120 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:52.120 "is_configured": true, 00:23:52.120 "data_offset": 0, 00:23:52.120 "data_size": 65536 00:23:52.120 } 00:23:52.120 ] 00:23:52.120 }' 00:23:52.120 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.120 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:52.120 09:28:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.120 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:52.120 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:23:52.120 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:52.120 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.120 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:52.120 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:52.120 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
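(Editor's note, not part of the captured log.) The loop traced above boils down to one polling pattern: query the raid bdev over the bdevperf RPC socket, pick out the rebuild process fields with jq, and sleep until the process object disappears. A condensed, illustrative sketch of that loop follows; the rpc.py path, socket and bdev name are taken from the trace, while the 60-second budget and variable names are assumptions.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
deadline=$((SECONDS + 60))                        # assumed rebuild time budget
while (( SECONDS < deadline )); do
    info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    ptype=$(jq -r '.process.type // "none"' <<< "$info")
    target=$(jq -r '.process.target // "none"' <<< "$info")
    # once the rebuild completes the process object is dropped and both fields read "none"
    [[ $ptype == none && $target == none ]] && break
    # while it runs, the test expects exactly type=rebuild, target=spare
    [[ $ptype == rebuild && $target == spare ]] || { echo "unexpected process state"; exit 1; }
    sleep 1
done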
00:23:52.120 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.120 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.380 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.380 "name": "raid_bdev1", 00:23:52.380 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:52.380 "strip_size_kb": 0, 00:23:52.380 "state": "online", 00:23:52.380 "raid_level": "raid1", 00:23:52.380 "superblock": false, 00:23:52.380 "num_base_bdevs": 2, 00:23:52.380 "num_base_bdevs_discovered": 2, 00:23:52.380 "num_base_bdevs_operational": 2, 00:23:52.380 "base_bdevs_list": [ 00:23:52.380 { 00:23:52.380 "name": "spare", 00:23:52.380 "uuid": "4ef6b346-091c-5f28-bc2b-4632b07b5ef5", 00:23:52.380 "is_configured": true, 00:23:52.380 "data_offset": 0, 00:23:52.380 "data_size": 65536 00:23:52.380 }, 00:23:52.380 { 00:23:52.380 "name": "BaseBdev2", 00:23:52.380 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:52.380 "is_configured": true, 00:23:52.380 "data_offset": 0, 00:23:52.380 "data_size": 65536 00:23:52.380 } 00:23:52.380 ] 00:23:52.380 }' 00:23:52.380 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.380 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:52.380 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.639 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:52.639 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:52.639 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:52.639 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:52.639 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:52.639 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:52.640 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:52.640 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:52.640 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:52.640 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:52.640 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:52.640 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.640 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.899 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:52.899 "name": "raid_bdev1", 00:23:52.899 "uuid": "2ddbcc69-a1d1-4cff-bc3b-f79cd2fedf5c", 00:23:52.899 "strip_size_kb": 0, 00:23:52.899 "state": "online", 00:23:52.899 "raid_level": "raid1", 00:23:52.899 "superblock": false, 00:23:52.899 "num_base_bdevs": 2, 00:23:52.899 
"num_base_bdevs_discovered": 2, 00:23:52.899 "num_base_bdevs_operational": 2, 00:23:52.899 "base_bdevs_list": [ 00:23:52.899 { 00:23:52.899 "name": "spare", 00:23:52.899 "uuid": "4ef6b346-091c-5f28-bc2b-4632b07b5ef5", 00:23:52.899 "is_configured": true, 00:23:52.899 "data_offset": 0, 00:23:52.899 "data_size": 65536 00:23:52.899 }, 00:23:52.899 { 00:23:52.899 "name": "BaseBdev2", 00:23:52.899 "uuid": "a382a16d-483e-5df1-bea0-c62b80fdd688", 00:23:52.899 "is_configured": true, 00:23:52.899 "data_offset": 0, 00:23:52.899 "data_size": 65536 00:23:52.899 } 00:23:52.899 ] 00:23:52.899 }' 00:23:52.899 09:28:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:52.899 09:28:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:53.468 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:53.468 [2024-07-15 09:28:02.382619] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:53.468 [2024-07-15 09:28:02.382652] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:53.727 00:23:53.727 Latency(us) 00:23:53.727 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:53.727 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:53.727 raid_bdev1 : 11.92 100.26 300.79 0.00 0.00 13165.08 297.41 118534.68 00:23:53.727 =================================================================================================================== 00:23:53.727 Total : 100.26 300.79 0.00 0.00 13165.08 297.41 118534.68 00:23:53.727 [2024-07-15 09:28:02.486847] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:53.727 [2024-07-15 09:28:02.486876] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:53.727 [2024-07-15 09:28:02.486959] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:53.727 [2024-07-15 09:28:02.486972] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc42070 name raid_bdev1, state offline 00:23:53.727 0 00:23:53.727 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:53.727 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 
00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:53.987 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:53.987 /dev/nbd0 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:54.247 1+0 records in 00:23:54.247 1+0 records out 00:23:54.247 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298309 s, 13.7 MB/s 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:54.247 09:28:02 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:54.247 09:28:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:23:54.505 /dev/nbd1 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:54.505 1+0 records in 00:23:54.505 1+0 records out 00:23:54.505 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262584 s, 15.6 MB/s 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for 
i in "${nbd_list[@]}" 00:23:54.505 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:54.764 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 199502 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 199502 ']' 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 199502 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 199502 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:55.023 
09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 199502' 00:23:55.023 killing process with pid 199502 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 199502 00:23:55.023 Received shutdown signal, test time was about 13.309560 seconds 00:23:55.023 00:23:55.023 Latency(us) 00:23:55.023 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:55.023 =================================================================================================================== 00:23:55.023 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:55.023 [2024-07-15 09:28:03.877060] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:55.023 09:28:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 199502 00:23:55.024 [2024-07-15 09:28:03.899058] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:55.283 00:23:55.283 real 0m17.369s 00:23:55.283 user 0m26.277s 00:23:55.283 sys 0m2.699s 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:55.283 ************************************ 00:23:55.283 END TEST raid_rebuild_test_io 00:23:55.283 ************************************ 00:23:55.283 09:28:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:55.283 09:28:04 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:23:55.283 09:28:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:55.283 09:28:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:55.283 09:28:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:55.283 ************************************ 00:23:55.283 START TEST raid_rebuild_test_sb_io 00:23:55.283 ************************************ 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:55.283 09:28:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=202016 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 202016 /var/tmp/spdk-raid.sock 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 202016 ']' 00:23:55.283 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:55.542 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:55.542 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:55.542 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:55.542 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:55.542 09:28:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:55.542 [2024-07-15 09:28:04.294740] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:23:55.542 [2024-07-15 09:28:04.294807] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202016 ] 00:23:55.542 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:55.542 Zero copy mechanism will not be used. 
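(Editor's note, not part of the captured log.) For the superblock variant the harness first starts a fresh bdevperf instance as the RPC target and waits for its UNIX socket. The bdevperf invocation and its parameters are verbatim from the trace; wrapping it in a background job with a pid variable is only a sketch of how the script drives it, and waitforlisten is the autotest_common.sh helper seen in the trace.

bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
$bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!
waitforlisten $raid_pid /var/tmp/spdk-raid.sock    # block until the RPC socket is up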
00:23:55.542 [2024-07-15 09:28:04.412286] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:55.801 [2024-07-15 09:28:04.515529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:55.801 [2024-07-15 09:28:04.575609] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:55.801 [2024-07-15 09:28:04.575656] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:56.369 09:28:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:56.369 09:28:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:23:56.369 09:28:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:56.369 09:28:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:56.629 BaseBdev1_malloc 00:23:56.629 09:28:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:56.887 [2024-07-15 09:28:05.692516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:56.887 [2024-07-15 09:28:05.692567] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:56.887 [2024-07-15 09:28:05.692591] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16d2d40 00:23:56.887 [2024-07-15 09:28:05.692604] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:56.887 [2024-07-15 09:28:05.694381] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:56.887 [2024-07-15 09:28:05.694410] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:56.887 BaseBdev1 00:23:56.887 09:28:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:56.887 09:28:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:57.146 BaseBdev2_malloc 00:23:57.146 09:28:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:57.429 [2024-07-15 09:28:06.179890] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:57.429 [2024-07-15 09:28:06.179943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:57.429 [2024-07-15 09:28:06.179967] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16d3860 00:23:57.429 [2024-07-15 09:28:06.179980] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:57.429 [2024-07-15 09:28:06.181518] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:57.429 [2024-07-15 09:28:06.181546] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:57.429 BaseBdev2 00:23:57.429 09:28:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b spare_malloc 00:23:57.730 spare_malloc 00:23:57.730 09:28:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:57.730 spare_delay 00:23:57.989 09:28:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:57.989 [2024-07-15 09:28:06.851503] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:57.989 [2024-07-15 09:28:06.851545] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:57.989 [2024-07-15 09:28:06.851565] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1881ec0 00:23:57.989 [2024-07-15 09:28:06.851577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:57.989 [2024-07-15 09:28:06.853054] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:57.989 [2024-07-15 09:28:06.853081] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:57.989 spare 00:23:57.989 09:28:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:58.248 [2024-07-15 09:28:07.032015] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:58.248 [2024-07-15 09:28:07.033246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:58.248 [2024-07-15 09:28:07.033415] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1883070 00:23:58.248 [2024-07-15 09:28:07.033428] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:58.248 [2024-07-15 09:28:07.033610] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x187c490 00:23:58.248 [2024-07-15 09:28:07.033748] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1883070 00:23:58.248 [2024-07-15 09:28:07.033758] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1883070 00:23:58.248 [2024-07-15 09:28:07.033853] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.248 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.508 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:58.508 "name": "raid_bdev1", 00:23:58.508 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:23:58.508 "strip_size_kb": 0, 00:23:58.508 "state": "online", 00:23:58.508 "raid_level": "raid1", 00:23:58.508 "superblock": true, 00:23:58.508 "num_base_bdevs": 2, 00:23:58.508 "num_base_bdevs_discovered": 2, 00:23:58.508 "num_base_bdevs_operational": 2, 00:23:58.508 "base_bdevs_list": [ 00:23:58.508 { 00:23:58.508 "name": "BaseBdev1", 00:23:58.508 "uuid": "e2f6e3a9-5330-592f-8bc5-c1ca617771e2", 00:23:58.508 "is_configured": true, 00:23:58.508 "data_offset": 2048, 00:23:58.508 "data_size": 63488 00:23:58.508 }, 00:23:58.508 { 00:23:58.508 "name": "BaseBdev2", 00:23:58.508 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:23:58.508 "is_configured": true, 00:23:58.508 "data_offset": 2048, 00:23:58.508 "data_size": 63488 00:23:58.508 } 00:23:58.508 ] 00:23:58.508 }' 00:23:58.508 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:58.508 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:59.075 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:59.075 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:59.075 [2024-07-15 09:28:07.970717] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:59.075 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:59.075 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.075 09:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:59.337 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:59.337 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:59.337 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:59.337 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:59.597 [2024-07-15 09:28:08.353607] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1883c50 00:23:59.597 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:59.597 Zero copy mechanism will not be used. 00:23:59.597 Running I/O for 60 seconds... 
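(Editor's note, not part of the captured log.) Stripped of the xtrace noise, the fixture assembled above is a small bdev stack per leg: a malloc bdev, a delay bdev in front of the spare only (presumably so the rebuild stays slow enough to observe), and a passthru bdev that the raid1 is created from with an on-disk superblock (-s), which is why data_offset reports 2048 blocks. An illustrative condensation of those RPC calls, arguments copied from the trace:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc bdev_malloc_create 32 512 -b BaseBdev1_malloc
$rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
$rpc bdev_malloc_create 32 512 -b BaseBdev2_malloc
$rpc bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
$rpc bdev_malloc_create 32 512 -b spare_malloc
$rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$rpc bdev_passthru_create -b spare_delay -p spare
# raid1 over the two passthru bdevs, with a superblock so metadata takes the first 2048 blocks
$rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1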
00:23:59.597 [2024-07-15 09:28:08.467581] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:59.597 [2024-07-15 09:28:08.475740] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1883c50 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.597 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.856 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:59.856 "name": "raid_bdev1", 00:23:59.856 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:23:59.856 "strip_size_kb": 0, 00:23:59.856 "state": "online", 00:23:59.856 "raid_level": "raid1", 00:23:59.856 "superblock": true, 00:23:59.856 "num_base_bdevs": 2, 00:23:59.856 "num_base_bdevs_discovered": 1, 00:23:59.856 "num_base_bdevs_operational": 1, 00:23:59.856 "base_bdevs_list": [ 00:23:59.856 { 00:23:59.856 "name": null, 00:23:59.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.856 "is_configured": false, 00:23:59.856 "data_offset": 2048, 00:23:59.856 "data_size": 63488 00:23:59.856 }, 00:23:59.856 { 00:23:59.856 "name": "BaseBdev2", 00:23:59.856 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:23:59.856 "is_configured": true, 00:23:59.856 "data_offset": 2048, 00:23:59.856 "data_size": 63488 00:23:59.856 } 00:23:59.856 ] 00:23:59.856 }' 00:23:59.856 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:59.856 09:28:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:00.794 09:28:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:00.794 [2024-07-15 09:28:09.640938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:00.794 [2024-07-15 09:28:09.683499] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17ef230 00:24:00.794 [2024-07-15 09:28:09.685855] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:00.794 09:28:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 
-- # sleep 1 00:24:01.053 [2024-07-15 09:28:09.813005] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:01.053 [2024-07-15 09:28:09.813353] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:01.311 [2024-07-15 09:28:10.046340] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:01.311 [2024-07-15 09:28:10.046548] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:01.568 [2024-07-15 09:28:10.293375] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:01.568 [2024-07-15 09:28:10.293684] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:01.568 [2024-07-15 09:28:10.504626] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:01.568 [2024-07-15 09:28:10.504902] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:01.826 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:01.826 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:01.826 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:01.826 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:01.826 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:01.826 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.826 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.826 [2024-07-15 09:28:10.745050] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:02.085 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:02.085 "name": "raid_bdev1", 00:24:02.085 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:02.085 "strip_size_kb": 0, 00:24:02.085 "state": "online", 00:24:02.085 "raid_level": "raid1", 00:24:02.085 "superblock": true, 00:24:02.085 "num_base_bdevs": 2, 00:24:02.085 "num_base_bdevs_discovered": 2, 00:24:02.085 "num_base_bdevs_operational": 2, 00:24:02.085 "process": { 00:24:02.085 "type": "rebuild", 00:24:02.085 "target": "spare", 00:24:02.085 "progress": { 00:24:02.085 "blocks": 14336, 00:24:02.085 "percent": 22 00:24:02.085 } 00:24:02.085 }, 00:24:02.085 "base_bdevs_list": [ 00:24:02.085 { 00:24:02.085 "name": "spare", 00:24:02.085 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:02.085 "is_configured": true, 00:24:02.085 "data_offset": 2048, 00:24:02.085 "data_size": 63488 00:24:02.085 }, 00:24:02.085 { 00:24:02.085 "name": "BaseBdev2", 00:24:02.085 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:02.085 "is_configured": true, 00:24:02.085 "data_offset": 2048, 00:24:02.085 "data_size": 63488 00:24:02.085 } 00:24:02.085 ] 00:24:02.085 }' 
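(Editor's note, not part of the captured log.) The degraded/hot-spare sequence traced above reduces to four steps: drop one mirror leg while bdevperf keeps issuing I/O, confirm the array stays online with a single base bdev, attach the prepared "spare", and check that a rebuild process targeting it appears. A hedged sketch of that sequence; the RPC calls and names are from the trace, the jq assertions are illustrative.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc bdev_raid_remove_base_bdev BaseBdev1
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[[ $(jq -r '.state' <<< "$info") == online ]] || exit 1                    # raid1 survives one missing leg
[[ $(jq -r '.num_base_bdevs_discovered' <<< "$info") == 1 ]] || exit 1
$rpc bdev_raid_add_base_bdev raid_bdev1 spare
sleep 1
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[[ $(jq -r '.process.type // "none"' <<< "$info") == rebuild ]] || exit 1  # rebuild started...
[[ $(jq -r '.process.target // "none"' <<< "$info") == spare ]] || exit 1  # ...onto the new spare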
00:24:02.085 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:02.085 [2024-07-15 09:28:10.989270] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:02.085 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:02.085 09:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:02.085 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:02.085 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:02.344 [2024-07-15 09:28:11.229871] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:02.344 [2024-07-15 09:28:11.249140] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:02.603 [2024-07-15 09:28:11.374012] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:02.603 [2024-07-15 09:28:11.375421] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:02.603 [2024-07-15 09:28:11.375447] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:02.603 [2024-07-15 09:28:11.375456] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:02.603 [2024-07-15 09:28:11.381308] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1883c50 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.603 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:02.863 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:02.863 "name": "raid_bdev1", 00:24:02.863 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:02.863 "strip_size_kb": 0, 00:24:02.863 "state": "online", 00:24:02.863 "raid_level": "raid1", 00:24:02.863 "superblock": 
true, 00:24:02.863 "num_base_bdevs": 2, 00:24:02.863 "num_base_bdevs_discovered": 1, 00:24:02.863 "num_base_bdevs_operational": 1, 00:24:02.863 "base_bdevs_list": [ 00:24:02.863 { 00:24:02.863 "name": null, 00:24:02.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.863 "is_configured": false, 00:24:02.863 "data_offset": 2048, 00:24:02.863 "data_size": 63488 00:24:02.863 }, 00:24:02.863 { 00:24:02.863 "name": "BaseBdev2", 00:24:02.863 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:02.863 "is_configured": true, 00:24:02.863 "data_offset": 2048, 00:24:02.863 "data_size": 63488 00:24:02.863 } 00:24:02.863 ] 00:24:02.863 }' 00:24:02.863 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:02.863 09:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:03.430 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:03.430 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:03.430 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:03.430 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:03.430 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:03.430 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.430 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.690 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:03.690 "name": "raid_bdev1", 00:24:03.690 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:03.690 "strip_size_kb": 0, 00:24:03.690 "state": "online", 00:24:03.690 "raid_level": "raid1", 00:24:03.690 "superblock": true, 00:24:03.690 "num_base_bdevs": 2, 00:24:03.690 "num_base_bdevs_discovered": 1, 00:24:03.690 "num_base_bdevs_operational": 1, 00:24:03.690 "base_bdevs_list": [ 00:24:03.690 { 00:24:03.690 "name": null, 00:24:03.690 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.690 "is_configured": false, 00:24:03.690 "data_offset": 2048, 00:24:03.690 "data_size": 63488 00:24:03.690 }, 00:24:03.690 { 00:24:03.690 "name": "BaseBdev2", 00:24:03.690 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:03.690 "is_configured": true, 00:24:03.690 "data_offset": 2048, 00:24:03.690 "data_size": 63488 00:24:03.690 } 00:24:03.690 ] 00:24:03.690 }' 00:24:03.690 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:03.690 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:03.690 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:03.690 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:03.690 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:03.949 [2024-07-15 09:28:12.837570] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:03.949 [2024-07-15 
09:28:12.896265] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1883e60 00:24:03.949 09:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:03.949 [2024-07-15 09:28:12.897734] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:04.207 [2024-07-15 09:28:13.023875] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:04.207 [2024-07-15 09:28:13.024274] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:04.467 [2024-07-15 09:28:13.253058] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:04.467 [2024-07-15 09:28:13.253324] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:04.725 [2024-07-15 09:28:13.509146] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:04.984 [2024-07-15 09:28:13.727569] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:04.984 [2024-07-15 09:28:13.727746] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:04.984 09:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:04.984 09:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:04.984 09:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:04.984 09:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:04.984 09:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:04.984 09:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.984 09:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.244 [2024-07-15 09:28:14.101622] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:05.244 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:05.244 "name": "raid_bdev1", 00:24:05.244 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:05.244 "strip_size_kb": 0, 00:24:05.244 "state": "online", 00:24:05.244 "raid_level": "raid1", 00:24:05.244 "superblock": true, 00:24:05.244 "num_base_bdevs": 2, 00:24:05.244 "num_base_bdevs_discovered": 2, 00:24:05.244 "num_base_bdevs_operational": 2, 00:24:05.244 "process": { 00:24:05.244 "type": "rebuild", 00:24:05.244 "target": "spare", 00:24:05.244 "progress": { 00:24:05.244 "blocks": 14336, 00:24:05.244 "percent": 22 00:24:05.244 } 00:24:05.244 }, 00:24:05.244 "base_bdevs_list": [ 00:24:05.244 { 00:24:05.244 "name": "spare", 00:24:05.244 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:05.244 "is_configured": true, 00:24:05.244 "data_offset": 2048, 00:24:05.244 "data_size": 63488 00:24:05.244 }, 00:24:05.244 { 00:24:05.244 "name": "BaseBdev2", 00:24:05.244 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 
00:24:05.244 "is_configured": true, 00:24:05.244 "data_offset": 2048, 00:24:05.244 "data_size": 63488 00:24:05.244 } 00:24:05.244 ] 00:24:05.244 }' 00:24:05.244 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:05.504 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=838 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.504 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.504 [2024-07-15 09:28:14.313240] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:05.763 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:05.763 "name": "raid_bdev1", 00:24:05.763 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:05.763 "strip_size_kb": 0, 00:24:05.763 "state": "online", 00:24:05.763 "raid_level": "raid1", 00:24:05.763 "superblock": true, 00:24:05.763 "num_base_bdevs": 2, 00:24:05.763 "num_base_bdevs_discovered": 2, 00:24:05.763 "num_base_bdevs_operational": 2, 00:24:05.763 "process": { 00:24:05.763 "type": "rebuild", 00:24:05.763 "target": "spare", 00:24:05.763 "progress": { 00:24:05.763 "blocks": 18432, 00:24:05.763 "percent": 29 00:24:05.763 } 00:24:05.763 }, 00:24:05.763 "base_bdevs_list": [ 00:24:05.763 { 00:24:05.763 "name": "spare", 00:24:05.763 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:05.763 "is_configured": true, 00:24:05.763 "data_offset": 2048, 00:24:05.763 "data_size": 63488 00:24:05.763 }, 00:24:05.763 { 00:24:05.763 "name": "BaseBdev2", 00:24:05.763 
"uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:05.763 "is_configured": true, 00:24:05.763 "data_offset": 2048, 00:24:05.763 "data_size": 63488 00:24:05.763 } 00:24:05.763 ] 00:24:05.763 }' 00:24:05.763 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:05.763 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:05.763 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:05.763 [2024-07-15 09:28:14.562683] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:05.763 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:05.763 09:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:05.763 [2024-07-15 09:28:14.682921] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:06.712 [2024-07-15 09:28:15.350869] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:06.712 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:06.712 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:06.712 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:06.712 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:06.712 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:06.712 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:06.712 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.712 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.971 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.971 "name": "raid_bdev1", 00:24:06.971 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:06.971 "strip_size_kb": 0, 00:24:06.971 "state": "online", 00:24:06.971 "raid_level": "raid1", 00:24:06.971 "superblock": true, 00:24:06.971 "num_base_bdevs": 2, 00:24:06.971 "num_base_bdevs_discovered": 2, 00:24:06.971 "num_base_bdevs_operational": 2, 00:24:06.971 "process": { 00:24:06.971 "type": "rebuild", 00:24:06.971 "target": "spare", 00:24:06.971 "progress": { 00:24:06.971 "blocks": 40960, 00:24:06.971 "percent": 64 00:24:06.971 } 00:24:06.971 }, 00:24:06.971 "base_bdevs_list": [ 00:24:06.971 { 00:24:06.971 "name": "spare", 00:24:06.971 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:06.971 "is_configured": true, 00:24:06.971 "data_offset": 2048, 00:24:06.971 "data_size": 63488 00:24:06.971 }, 00:24:06.971 { 00:24:06.971 "name": "BaseBdev2", 00:24:06.971 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:06.971 "is_configured": true, 00:24:06.971 "data_offset": 2048, 00:24:06.971 "data_size": 63488 00:24:06.971 } 00:24:06.971 ] 00:24:06.971 }' 00:24:06.972 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:24:06.972 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:06.972 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:07.230 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:07.230 09:28:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:08.168 09:28:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:08.168 09:28:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:08.168 09:28:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.168 09:28:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:08.168 09:28:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:08.168 09:28:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.168 09:28:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.168 09:28:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.168 [2024-07-15 09:28:16.964101] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:08.168 [2024-07-15 09:28:17.072399] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:08.168 [2024-07-15 09:28:17.074242] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:08.427 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:08.427 "name": "raid_bdev1", 00:24:08.427 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:08.427 "strip_size_kb": 0, 00:24:08.427 "state": "online", 00:24:08.427 "raid_level": "raid1", 00:24:08.427 "superblock": true, 00:24:08.427 "num_base_bdevs": 2, 00:24:08.427 "num_base_bdevs_discovered": 2, 00:24:08.427 "num_base_bdevs_operational": 2, 00:24:08.427 "base_bdevs_list": [ 00:24:08.427 { 00:24:08.427 "name": "spare", 00:24:08.427 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:08.427 "is_configured": true, 00:24:08.427 "data_offset": 2048, 00:24:08.427 "data_size": 63488 00:24:08.427 }, 00:24:08.427 { 00:24:08.427 "name": "BaseBdev2", 00:24:08.427 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:08.427 "is_configured": true, 00:24:08.427 "data_offset": 2048, 00:24:08.427 "data_size": 63488 00:24:08.427 } 00:24:08.427 ] 00:24:08.427 }' 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:08.428 
09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.428 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.688 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:08.688 "name": "raid_bdev1", 00:24:08.688 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:08.688 "strip_size_kb": 0, 00:24:08.688 "state": "online", 00:24:08.688 "raid_level": "raid1", 00:24:08.688 "superblock": true, 00:24:08.688 "num_base_bdevs": 2, 00:24:08.688 "num_base_bdevs_discovered": 2, 00:24:08.688 "num_base_bdevs_operational": 2, 00:24:08.688 "base_bdevs_list": [ 00:24:08.688 { 00:24:08.688 "name": "spare", 00:24:08.688 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:08.688 "is_configured": true, 00:24:08.688 "data_offset": 2048, 00:24:08.688 "data_size": 63488 00:24:08.688 }, 00:24:08.688 { 00:24:08.688 "name": "BaseBdev2", 00:24:08.688 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:08.688 "is_configured": true, 00:24:08.688 "data_offset": 2048, 00:24:08.688 "data_size": 63488 00:24:08.688 } 00:24:08.688 ] 00:24:08.688 }' 00:24:08.688 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:08.688 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:08.688 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:08.947 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:08.948 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.948 09:28:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.948 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:08.948 "name": "raid_bdev1", 00:24:08.948 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:08.948 "strip_size_kb": 0, 00:24:08.948 "state": "online", 00:24:08.948 "raid_level": "raid1", 00:24:08.948 "superblock": true, 00:24:08.948 "num_base_bdevs": 2, 00:24:08.948 "num_base_bdevs_discovered": 2, 00:24:08.948 "num_base_bdevs_operational": 2, 00:24:08.948 "base_bdevs_list": [ 00:24:08.948 { 00:24:08.948 "name": "spare", 00:24:08.948 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:08.948 "is_configured": true, 00:24:08.948 "data_offset": 2048, 00:24:08.948 "data_size": 63488 00:24:08.948 }, 00:24:08.948 { 00:24:08.948 "name": "BaseBdev2", 00:24:08.948 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:08.948 "is_configured": true, 00:24:08.948 "data_offset": 2048, 00:24:08.948 "data_size": 63488 00:24:08.948 } 00:24:08.948 ] 00:24:08.948 }' 00:24:08.948 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:08.948 09:28:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:09.513 09:28:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:09.770 [2024-07-15 09:28:18.681411] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:09.770 [2024-07-15 09:28:18.681441] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:10.027 00:24:10.027 Latency(us) 00:24:10.027 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:10.027 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:10.027 raid_bdev1 : 10.35 105.52 316.57 0.00 0.00 12610.08 288.50 117622.87 00:24:10.027 =================================================================================================================== 00:24:10.028 Total : 105.52 316.57 0.00 0.00 12610.08 288.50 117622.87 00:24:10.028 [2024-07-15 09:28:18.733522] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:10.028 [2024-07-15 09:28:18.733550] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:10.028 [2024-07-15 09:28:18.733624] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:10.028 [2024-07-15 09:28:18.733636] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1883070 name raid_bdev1, state offline 00:24:10.028 0 00:24:10.028 09:28:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.028 09:28:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:10.286 09:28:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:10.286 09:28:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 
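The entries that follow export the remaining base bdevs over NBD and compare their payloads byte-for-byte. Below is a minimal sketch of that data check, assuming the RPC socket (/var/tmp/spdk-raid.sock), rpc.py path, and bdev names (spare, BaseBdev2) shown in this log; the rpc() wrapper is shorthand for this sketch only, and the 1048576-byte skip simply matches the data_offset of 2048 512-byte blocks reported in the raid_bdev_info above. It is not a reproduction of bdev_raid.sh itself.

    # Sketch: expose both base bdevs as NBD block devices via the SPDK RPC socket.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    rpc nbd_start_disk spare /dev/nbd0
    rpc nbd_start_disk BaseBdev2 /dev/nbd1
    # Compare the data areas, skipping the first 1 MiB of metadata in both devices.
    if cmp -i 1048576 /dev/nbd0 /dev/nbd1; then
        echo "base bdevs match after rebuild"
    fi
    # Tear the NBD mappings back down.
    rpc nbd_stop_disk /dev/nbd1
    rpc nbd_stop_disk /dev/nbd0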
00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:10.286 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:10.286 /dev/nbd0 00:24:10.543 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:10.543 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:10.543 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:10.543 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:10.543 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:10.543 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:10.543 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:10.543 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:10.544 1+0 records in 00:24:10.544 1+0 records out 00:24:10.544 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00030111 s, 13.6 MB/s 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 
']' 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:10.544 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:10.801 /dev/nbd1 00:24:10.801 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:10.801 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:10.801 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:10.801 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:10.801 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:10.801 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:10.801 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:10.801 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:10.801 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:10.802 1+0 records in 00:24:10.802 1+0 records out 00:24:10.802 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00031881 s, 12.8 MB/s 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:10.802 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:11.059 09:28:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' 
true = true ']' 00:24:11.317 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:11.575 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:11.832 [2024-07-15 09:28:20.649666] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:11.832 [2024-07-15 09:28:20.649713] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:11.832 [2024-07-15 09:28:20.649739] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e3e70 00:24:11.832 [2024-07-15 09:28:20.649752] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:11.832 [2024-07-15 09:28:20.651377] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:11.832 [2024-07-15 09:28:20.651405] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:11.832 [2024-07-15 09:28:20.651480] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:11.832 [2024-07-15 09:28:20.651505] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:11.832 [2024-07-15 09:28:20.651604] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:11.832 spare 00:24:11.832 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:11.832 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:11.832 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:11.832 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:11.833 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:11.833 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:11.833 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.833 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.833 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.833 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.833 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.833 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.833 [2024-07-15 09:28:20.751917] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1883490 00:24:11.833 [2024-07-15 09:28:20.751938] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:11.833 [2024-07-15 09:28:20.752125] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e41d0 00:24:11.833 [2024-07-15 09:28:20.752272] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1883490 00:24:11.833 [2024-07-15 09:28:20.752282] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1883490 00:24:11.833 [2024-07-15 09:28:20.752387] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:12.090 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.090 "name": "raid_bdev1", 00:24:12.090 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:12.090 "strip_size_kb": 0, 00:24:12.090 "state": "online", 00:24:12.090 "raid_level": "raid1", 00:24:12.090 "superblock": true, 00:24:12.090 "num_base_bdevs": 2, 00:24:12.090 "num_base_bdevs_discovered": 2, 00:24:12.090 "num_base_bdevs_operational": 2, 00:24:12.090 "base_bdevs_list": [ 00:24:12.090 { 00:24:12.090 "name": "spare", 00:24:12.090 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:12.090 "is_configured": true, 00:24:12.090 "data_offset": 2048, 00:24:12.090 "data_size": 63488 00:24:12.090 }, 00:24:12.090 { 00:24:12.090 "name": "BaseBdev2", 00:24:12.090 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:12.090 "is_configured": true, 00:24:12.090 "data_offset": 2048, 00:24:12.090 "data_size": 63488 00:24:12.090 } 00:24:12.090 ] 00:24:12.090 }' 00:24:12.090 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.090 09:28:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:12.657 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:12.657 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.657 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:12.657 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:12.657 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:12.657 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.657 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.657 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.657 "name": "raid_bdev1", 00:24:12.657 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:12.657 "strip_size_kb": 0, 00:24:12.657 "state": "online", 00:24:12.657 "raid_level": "raid1", 00:24:12.657 "superblock": true, 00:24:12.657 "num_base_bdevs": 2, 00:24:12.657 "num_base_bdevs_discovered": 2, 00:24:12.657 "num_base_bdevs_operational": 2, 00:24:12.657 "base_bdevs_list": [ 00:24:12.657 { 00:24:12.657 "name": "spare", 00:24:12.657 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:12.657 "is_configured": true, 00:24:12.657 "data_offset": 2048, 00:24:12.657 "data_size": 63488 00:24:12.657 }, 00:24:12.657 { 00:24:12.657 "name": "BaseBdev2", 00:24:12.657 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:12.657 "is_configured": true, 00:24:12.657 "data_offset": 2048, 00:24:12.657 "data_size": 63488 00:24:12.657 } 00:24:12.657 ] 00:24:12.657 }' 00:24:12.657 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.914 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:12.914 09:28:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.914 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:12.914 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.914 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:13.172 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:13.172 09:28:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:13.429 [2024-07-15 09:28:22.162102] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:13.429 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:13.429 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:13.429 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:13.430 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:13.430 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:13.430 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:13.430 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:13.430 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:13.430 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:13.430 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:13.430 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.430 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.688 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:13.688 "name": "raid_bdev1", 00:24:13.688 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:13.688 "strip_size_kb": 0, 00:24:13.688 "state": "online", 00:24:13.688 "raid_level": "raid1", 00:24:13.688 "superblock": true, 00:24:13.688 "num_base_bdevs": 2, 00:24:13.688 "num_base_bdevs_discovered": 1, 00:24:13.688 "num_base_bdevs_operational": 1, 00:24:13.688 "base_bdevs_list": [ 00:24:13.688 { 00:24:13.688 "name": null, 00:24:13.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:13.688 "is_configured": false, 00:24:13.688 "data_offset": 2048, 00:24:13.688 "data_size": 63488 00:24:13.688 }, 00:24:13.688 { 00:24:13.688 "name": "BaseBdev2", 00:24:13.688 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:13.688 "is_configured": true, 00:24:13.688 "data_offset": 2048, 00:24:13.688 "data_size": 63488 00:24:13.688 } 00:24:13.688 ] 00:24:13.688 }' 00:24:13.688 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:13.688 09:28:22 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@10 -- # set +x 00:24:14.308 09:28:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:14.565 [2024-07-15 09:28:23.273204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:14.565 [2024-07-15 09:28:23.273356] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:14.565 [2024-07-15 09:28:23.273373] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:14.565 [2024-07-15 09:28:23.273400] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:14.565 [2024-07-15 09:28:23.278633] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e5f40 00:24:14.565 [2024-07-15 09:28:23.280957] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:14.565 09:28:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:15.495 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:15.495 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:15.495 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:15.495 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:15.495 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:15.495 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.495 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.753 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.753 "name": "raid_bdev1", 00:24:15.753 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:15.753 "strip_size_kb": 0, 00:24:15.753 "state": "online", 00:24:15.753 "raid_level": "raid1", 00:24:15.753 "superblock": true, 00:24:15.753 "num_base_bdevs": 2, 00:24:15.753 "num_base_bdevs_discovered": 2, 00:24:15.753 "num_base_bdevs_operational": 2, 00:24:15.753 "process": { 00:24:15.753 "type": "rebuild", 00:24:15.753 "target": "spare", 00:24:15.753 "progress": { 00:24:15.753 "blocks": 24576, 00:24:15.753 "percent": 38 00:24:15.753 } 00:24:15.753 }, 00:24:15.753 "base_bdevs_list": [ 00:24:15.753 { 00:24:15.753 "name": "spare", 00:24:15.753 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:15.753 "is_configured": true, 00:24:15.753 "data_offset": 2048, 00:24:15.753 "data_size": 63488 00:24:15.753 }, 00:24:15.753 { 00:24:15.753 "name": "BaseBdev2", 00:24:15.753 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:15.753 "is_configured": true, 00:24:15.753 "data_offset": 2048, 00:24:15.753 "data_size": 63488 00:24:15.753 } 00:24:15.753 ] 00:24:15.753 }' 00:24:15.753 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.753 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:15.753 09:28:24 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.753 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:15.753 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:16.011 [2024-07-15 09:28:24.865634] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:16.011 [2024-07-15 09:28:24.893637] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:16.011 [2024-07-15 09:28:24.893682] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:16.011 [2024-07-15 09:28:24.893698] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:16.011 [2024-07-15 09:28:24.893706] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.011 09:28:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.269 09:28:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:16.269 "name": "raid_bdev1", 00:24:16.269 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:16.269 "strip_size_kb": 0, 00:24:16.269 "state": "online", 00:24:16.269 "raid_level": "raid1", 00:24:16.269 "superblock": true, 00:24:16.269 "num_base_bdevs": 2, 00:24:16.269 "num_base_bdevs_discovered": 1, 00:24:16.269 "num_base_bdevs_operational": 1, 00:24:16.269 "base_bdevs_list": [ 00:24:16.269 { 00:24:16.269 "name": null, 00:24:16.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:16.269 "is_configured": false, 00:24:16.269 "data_offset": 2048, 00:24:16.269 "data_size": 63488 00:24:16.269 }, 00:24:16.269 { 00:24:16.269 "name": "BaseBdev2", 00:24:16.269 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:16.269 "is_configured": true, 00:24:16.269 "data_offset": 2048, 00:24:16.269 "data_size": 63488 00:24:16.269 } 00:24:16.269 ] 00:24:16.269 }' 00:24:16.269 09:28:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
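At this point the spare device has been removed and the test has confirmed that raid_bdev1 stays online in degraded mode with a single discovered base bdev. The following is a small sketch of that state check, reusing the bdev_raid_get_bdevs call and jq filters visible in the surrounding entries; it is a simplified stand-in for the script's verify_raid_bdev_state helper, not its actual code, and the rpc() wrapper is shorthand for this sketch only.

    # Sketch: query all raid bdevs and pick out raid_bdev1, as the log's jq filter does.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    state=$(jq -r '.state' <<< "$info")
    discovered=$(jq -r '.num_base_bdevs_discovered' <<< "$info")
    operational=$(jq -r '.num_base_bdevs_operational' <<< "$info")
    # Degraded but usable raid1: state online, only 1 of 2 base bdevs discovered/operational.
    if [ "$state" = online ] && [ "$discovered" -eq 1 ] && [ "$operational" -eq 1 ]; then
        echo "raid_bdev1 is online and degraded (1/2 base bdevs)"
    else
        echo "unexpected raid_bdev1 state: $state ($discovered discovered)" >&2
        exit 1
    fi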
00:24:16.269 09:28:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:16.834 09:28:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:17.092 [2024-07-15 09:28:25.970002] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:17.092 [2024-07-15 09:28:25.970054] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:17.092 [2024-07-15 09:28:25.970080] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16d2490 00:24:17.092 [2024-07-15 09:28:25.970098] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:17.092 [2024-07-15 09:28:25.970463] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:17.092 [2024-07-15 09:28:25.970481] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:17.092 [2024-07-15 09:28:25.970565] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:17.092 [2024-07-15 09:28:25.970577] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:17.092 [2024-07-15 09:28:25.970588] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:17.092 [2024-07-15 09:28:25.970608] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:17.092 [2024-07-15 09:28:25.975880] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e5f40 00:24:17.092 spare 00:24:17.092 [2024-07-15 09:28:25.977346] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:17.092 09:28:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:18.465 09:28:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:18.465 09:28:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:18.465 09:28:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:18.465 09:28:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:18.465 09:28:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:18.465 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.465 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.465 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:18.465 "name": "raid_bdev1", 00:24:18.465 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:18.465 "strip_size_kb": 0, 00:24:18.465 "state": "online", 00:24:18.465 "raid_level": "raid1", 00:24:18.465 "superblock": true, 00:24:18.465 "num_base_bdevs": 2, 00:24:18.465 "num_base_bdevs_discovered": 2, 00:24:18.465 "num_base_bdevs_operational": 2, 00:24:18.465 "process": { 00:24:18.465 "type": "rebuild", 00:24:18.465 "target": "spare", 00:24:18.465 "progress": { 00:24:18.465 "blocks": 24576, 00:24:18.465 "percent": 38 00:24:18.465 } 
00:24:18.465 }, 00:24:18.465 "base_bdevs_list": [ 00:24:18.465 { 00:24:18.465 "name": "spare", 00:24:18.465 "uuid": "57cf90c4-19e2-5fc2-882d-d204bcab4a32", 00:24:18.465 "is_configured": true, 00:24:18.465 "data_offset": 2048, 00:24:18.465 "data_size": 63488 00:24:18.465 }, 00:24:18.465 { 00:24:18.465 "name": "BaseBdev2", 00:24:18.465 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:18.465 "is_configured": true, 00:24:18.465 "data_offset": 2048, 00:24:18.465 "data_size": 63488 00:24:18.465 } 00:24:18.465 ] 00:24:18.465 }' 00:24:18.465 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:18.465 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:18.465 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:18.465 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:18.465 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:18.724 [2024-07-15 09:28:27.540611] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:18.724 [2024-07-15 09:28:27.590066] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:18.724 [2024-07-15 09:28:27.590111] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:18.724 [2024-07-15 09:28:27.590127] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:18.724 [2024-07-15 09:28:27.590135] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.724 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.983 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:18.983 "name": "raid_bdev1", 00:24:18.983 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:18.983 "strip_size_kb": 0, 00:24:18.983 "state": 
"online", 00:24:18.983 "raid_level": "raid1", 00:24:18.983 "superblock": true, 00:24:18.983 "num_base_bdevs": 2, 00:24:18.983 "num_base_bdevs_discovered": 1, 00:24:18.983 "num_base_bdevs_operational": 1, 00:24:18.983 "base_bdevs_list": [ 00:24:18.983 { 00:24:18.983 "name": null, 00:24:18.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:18.983 "is_configured": false, 00:24:18.983 "data_offset": 2048, 00:24:18.983 "data_size": 63488 00:24:18.983 }, 00:24:18.983 { 00:24:18.983 "name": "BaseBdev2", 00:24:18.983 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:18.983 "is_configured": true, 00:24:18.983 "data_offset": 2048, 00:24:18.983 "data_size": 63488 00:24:18.983 } 00:24:18.983 ] 00:24:18.983 }' 00:24:18.983 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:18.983 09:28:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:19.549 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:19.549 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:19.549 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:19.549 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:19.549 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:19.549 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.549 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.807 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:19.807 "name": "raid_bdev1", 00:24:19.807 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:19.807 "strip_size_kb": 0, 00:24:19.807 "state": "online", 00:24:19.807 "raid_level": "raid1", 00:24:19.807 "superblock": true, 00:24:19.807 "num_base_bdevs": 2, 00:24:19.807 "num_base_bdevs_discovered": 1, 00:24:19.807 "num_base_bdevs_operational": 1, 00:24:19.807 "base_bdevs_list": [ 00:24:19.807 { 00:24:19.807 "name": null, 00:24:19.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:19.807 "is_configured": false, 00:24:19.807 "data_offset": 2048, 00:24:19.807 "data_size": 63488 00:24:19.807 }, 00:24:19.807 { 00:24:19.807 "name": "BaseBdev2", 00:24:19.807 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:19.807 "is_configured": true, 00:24:19.807 "data_offset": 2048, 00:24:19.807 "data_size": 63488 00:24:19.807 } 00:24:19.807 ] 00:24:19.807 }' 00:24:19.807 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:20.065 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:20.065 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:20.065 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:20.065 09:28:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:20.323 09:28:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:20.323 [2024-07-15 09:28:29.271370] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:20.323 [2024-07-15 09:28:29.271414] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.323 [2024-07-15 09:28:29.271434] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e3750 00:24:20.323 [2024-07-15 09:28:29.271447] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.323 [2024-07-15 09:28:29.271793] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.323 [2024-07-15 09:28:29.271813] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:20.323 [2024-07-15 09:28:29.271878] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:20.323 [2024-07-15 09:28:29.271894] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:20.323 [2024-07-15 09:28:29.271907] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:20.323 BaseBdev1 00:24:20.581 09:28:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.515 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.773 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:21.773 "name": "raid_bdev1", 00:24:21.773 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:21.773 "strip_size_kb": 0, 00:24:21.773 "state": "online", 00:24:21.773 "raid_level": "raid1", 00:24:21.773 "superblock": true, 00:24:21.773 "num_base_bdevs": 2, 00:24:21.773 "num_base_bdevs_discovered": 1, 00:24:21.773 "num_base_bdevs_operational": 1, 00:24:21.773 "base_bdevs_list": [ 00:24:21.773 { 00:24:21.773 "name": null, 00:24:21.773 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:21.773 "is_configured": false, 
00:24:21.773 "data_offset": 2048, 00:24:21.773 "data_size": 63488 00:24:21.773 }, 00:24:21.773 { 00:24:21.773 "name": "BaseBdev2", 00:24:21.773 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:21.773 "is_configured": true, 00:24:21.773 "data_offset": 2048, 00:24:21.773 "data_size": 63488 00:24:21.773 } 00:24:21.773 ] 00:24:21.773 }' 00:24:21.773 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:21.773 09:28:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:22.336 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:22.336 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:22.336 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:22.336 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:22.336 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:22.336 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.336 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:22.594 "name": "raid_bdev1", 00:24:22.594 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:22.594 "strip_size_kb": 0, 00:24:22.594 "state": "online", 00:24:22.594 "raid_level": "raid1", 00:24:22.594 "superblock": true, 00:24:22.594 "num_base_bdevs": 2, 00:24:22.594 "num_base_bdevs_discovered": 1, 00:24:22.594 "num_base_bdevs_operational": 1, 00:24:22.594 "base_bdevs_list": [ 00:24:22.594 { 00:24:22.594 "name": null, 00:24:22.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.594 "is_configured": false, 00:24:22.594 "data_offset": 2048, 00:24:22.594 "data_size": 63488 00:24:22.594 }, 00:24:22.594 { 00:24:22.594 "name": "BaseBdev2", 00:24:22.594 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:22.594 "is_configured": true, 00:24:22.594 "data_offset": 2048, 00:24:22.594 "data_size": 63488 00:24:22.594 } 00:24:22.594 ] 00:24:22.594 }' 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:22.594 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:22.852 [2024-07-15 09:28:31.698178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:22.852 [2024-07-15 09:28:31.698305] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:22.852 [2024-07-15 09:28:31.698321] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:22.852 request: 00:24:22.852 { 00:24:22.852 "base_bdev": "BaseBdev1", 00:24:22.852 "raid_bdev": "raid_bdev1", 00:24:22.852 "method": "bdev_raid_add_base_bdev", 00:24:22.852 "req_id": 1 00:24:22.852 } 00:24:22.852 Got JSON-RPC error response 00:24:22.852 response: 00:24:22.852 { 00:24:22.852 "code": -22, 00:24:22.852 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:22.852 } 00:24:22.852 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:24:22.852 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:22.852 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:22.852 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:22.852 09:28:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:23.784 09:28:32 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.784 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.042 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:24.042 "name": "raid_bdev1", 00:24:24.042 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:24.042 "strip_size_kb": 0, 00:24:24.042 "state": "online", 00:24:24.042 "raid_level": "raid1", 00:24:24.042 "superblock": true, 00:24:24.042 "num_base_bdevs": 2, 00:24:24.042 "num_base_bdevs_discovered": 1, 00:24:24.042 "num_base_bdevs_operational": 1, 00:24:24.042 "base_bdevs_list": [ 00:24:24.042 { 00:24:24.042 "name": null, 00:24:24.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.042 "is_configured": false, 00:24:24.042 "data_offset": 2048, 00:24:24.042 "data_size": 63488 00:24:24.042 }, 00:24:24.042 { 00:24:24.042 "name": "BaseBdev2", 00:24:24.042 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:24.042 "is_configured": true, 00:24:24.042 "data_offset": 2048, 00:24:24.042 "data_size": 63488 00:24:24.042 } 00:24:24.042 ] 00:24:24.042 }' 00:24:24.042 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:24.042 09:28:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:24.607 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:24.607 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:24.607 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:24.607 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:24.607 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:24.865 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.865 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.865 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:24.865 "name": "raid_bdev1", 00:24:24.865 "uuid": "8431f819-311c-4417-9f5e-5026f3b78f00", 00:24:24.865 "strip_size_kb": 0, 00:24:24.865 "state": "online", 00:24:24.865 "raid_level": "raid1", 00:24:24.865 "superblock": true, 00:24:24.865 "num_base_bdevs": 2, 00:24:24.865 "num_base_bdevs_discovered": 1, 00:24:24.865 "num_base_bdevs_operational": 1, 00:24:24.865 "base_bdevs_list": [ 00:24:24.865 { 00:24:24.865 "name": null, 00:24:24.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.865 "is_configured": false, 00:24:24.865 "data_offset": 2048, 00:24:24.865 "data_size": 63488 00:24:24.865 }, 00:24:24.865 { 00:24:24.865 "name": "BaseBdev2", 00:24:24.865 "uuid": "e50fc276-2d5b-57d3-a26b-d9775aac5e2d", 00:24:24.865 
"is_configured": true, 00:24:24.865 "data_offset": 2048, 00:24:24.865 "data_size": 63488 00:24:24.865 } 00:24:24.865 ] 00:24:24.865 }' 00:24:24.865 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:24.865 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:24.865 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 202016 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 202016 ']' 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 202016 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 202016 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 202016' 00:24:25.122 killing process with pid 202016 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 202016 00:24:25.122 Received shutdown signal, test time was about 25.467143 seconds 00:24:25.122 00:24:25.122 Latency(us) 00:24:25.122 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:25.122 =================================================================================================================== 00:24:25.122 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:25.122 [2024-07-15 09:28:33.885162] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:25.122 [2024-07-15 09:28:33.885252] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:25.122 [2024-07-15 09:28:33.885295] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:25.122 [2024-07-15 09:28:33.885307] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1883490 name raid_bdev1, state offline 00:24:25.122 09:28:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 202016 00:24:25.122 [2024-07-15 09:28:33.906105] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:25.380 00:24:25.380 real 0m29.888s 00:24:25.380 user 0m47.109s 00:24:25.380 sys 0m4.356s 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:25.380 ************************************ 00:24:25.380 END TEST raid_rebuild_test_sb_io 00:24:25.380 ************************************ 00:24:25.380 09:28:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:25.380 09:28:34 bdev_raid -- 
bdev/bdev_raid.sh@876 -- # for n in 2 4 00:24:25.380 09:28:34 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:24:25.380 09:28:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:25.380 09:28:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:25.380 09:28:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:25.380 ************************************ 00:24:25.380 START TEST raid_rebuild_test 00:24:25.380 ************************************ 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:25.380 09:28:34 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=206320 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 206320 /var/tmp/spdk-raid.sock 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 206320 ']' 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:25.380 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:25.380 09:28:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:25.380 [2024-07-15 09:28:34.269954] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:24:25.380 [2024-07-15 09:28:34.270020] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid206320 ] 00:24:25.380 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:25.380 Zero copy mechanism will not be used. 
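The trace above shows how raid_rebuild_test brings up its target: it launches the bdevperf example against a private RPC socket, records the pid, and blocks until that socket answers before creating any bdevs. A minimal sketch of that launch-and-wait pattern follows; the relative paths, the polling loop (a stand-in for the harness's waitforlisten helper), and the use of rpc_get_methods as a readiness probe are assumptions reconstructed from the surrounding xtrace, not taken from bdev_raid.sh itself.
  # sketch only -- paths shortened, waitforlisten replaced by a simple poll (assumption)
  sock=/var/tmp/spdk-raid.sock
  ./build/examples/bdevperf -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!
  # wait for the bdevperf RPC server to start listening on the socket
  until ./scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done
  # first base bdev of the test, as issued in the trace below
  ./scripts/rpc.py -s "$sock" bdev_malloc_create 32 512 -b BaseBdev1_malloc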
00:24:25.638 [2024-07-15 09:28:34.413835] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:25.638 [2024-07-15 09:28:34.536864] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:25.896 [2024-07-15 09:28:34.601456] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:25.896 [2024-07-15 09:28:34.601489] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:26.463 09:28:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:26.463 09:28:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:24:26.463 09:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:26.464 09:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:26.722 BaseBdev1_malloc 00:24:26.722 09:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:26.981 [2024-07-15 09:28:35.755186] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:26.981 [2024-07-15 09:28:35.755232] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:26.981 [2024-07-15 09:28:35.755258] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e3ed40 00:24:26.981 [2024-07-15 09:28:35.755271] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:26.981 [2024-07-15 09:28:35.757021] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:26.981 [2024-07-15 09:28:35.757050] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:26.981 BaseBdev1 00:24:26.981 09:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:26.981 09:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:27.239 BaseBdev2_malloc 00:24:27.239 09:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:27.498 [2024-07-15 09:28:36.249334] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:27.498 [2024-07-15 09:28:36.249382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:27.498 [2024-07-15 09:28:36.249406] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e3f860 00:24:27.498 [2024-07-15 09:28:36.249419] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:27.498 [2024-07-15 09:28:36.251022] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:27.498 [2024-07-15 09:28:36.251050] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:27.498 BaseBdev2 00:24:27.498 09:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:27.498 09:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:27.756 BaseBdev3_malloc 00:24:27.756 09:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:28.014 [2024-07-15 09:28:36.744467] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:28.014 [2024-07-15 09:28:36.744512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:28.014 [2024-07-15 09:28:36.744535] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fec8f0 00:24:28.015 [2024-07-15 09:28:36.744549] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:28.015 [2024-07-15 09:28:36.746131] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:28.015 [2024-07-15 09:28:36.746164] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:28.015 BaseBdev3 00:24:28.015 09:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:28.015 09:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:28.274 BaseBdev4_malloc 00:24:28.274 09:28:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:28.274 [2024-07-15 09:28:37.219545] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:28.274 [2024-07-15 09:28:37.219587] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:28.274 [2024-07-15 09:28:37.219609] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1febad0 00:24:28.274 [2024-07-15 09:28:37.219622] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:28.274 [2024-07-15 09:28:37.221125] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:28.274 [2024-07-15 09:28:37.221154] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:28.274 BaseBdev4 00:24:28.533 09:28:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:28.533 spare_malloc 00:24:28.791 09:28:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:28.791 spare_delay 00:24:28.791 09:28:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:29.050 [2024-07-15 09:28:37.958075] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:29.050 [2024-07-15 09:28:37.958123] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:29.050 [2024-07-15 09:28:37.958146] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: 
io_device created at: 0x0x1ff05b0 00:24:29.050 [2024-07-15 09:28:37.958160] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:29.050 [2024-07-15 09:28:37.959734] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:29.050 [2024-07-15 09:28:37.959763] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:29.050 spare 00:24:29.050 09:28:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:29.308 [2024-07-15 09:28:38.202750] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:29.308 [2024-07-15 09:28:38.204096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:29.308 [2024-07-15 09:28:38.204151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:29.308 [2024-07-15 09:28:38.204196] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:29.308 [2024-07-15 09:28:38.204276] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f6f8a0 00:24:29.308 [2024-07-15 09:28:38.204287] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:29.308 [2024-07-15 09:28:38.204502] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fe9e10 00:24:29.308 [2024-07-15 09:28:38.204652] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f6f8a0 00:24:29.308 [2024-07-15 09:28:38.204662] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f6f8a0 00:24:29.308 [2024-07-15 09:28:38.204777] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.308 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.567 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:29.567 "name": "raid_bdev1", 00:24:29.567 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:29.567 "strip_size_kb": 0, 00:24:29.567 
"state": "online", 00:24:29.567 "raid_level": "raid1", 00:24:29.567 "superblock": false, 00:24:29.567 "num_base_bdevs": 4, 00:24:29.567 "num_base_bdevs_discovered": 4, 00:24:29.567 "num_base_bdevs_operational": 4, 00:24:29.567 "base_bdevs_list": [ 00:24:29.567 { 00:24:29.567 "name": "BaseBdev1", 00:24:29.567 "uuid": "8949aa2d-4834-5401-a06d-0a229caa7737", 00:24:29.567 "is_configured": true, 00:24:29.567 "data_offset": 0, 00:24:29.567 "data_size": 65536 00:24:29.567 }, 00:24:29.567 { 00:24:29.567 "name": "BaseBdev2", 00:24:29.567 "uuid": "4f0c7017-1c81-59ad-a20c-c3aafe84aa53", 00:24:29.567 "is_configured": true, 00:24:29.567 "data_offset": 0, 00:24:29.567 "data_size": 65536 00:24:29.567 }, 00:24:29.567 { 00:24:29.567 "name": "BaseBdev3", 00:24:29.567 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:29.567 "is_configured": true, 00:24:29.567 "data_offset": 0, 00:24:29.567 "data_size": 65536 00:24:29.567 }, 00:24:29.567 { 00:24:29.567 "name": "BaseBdev4", 00:24:29.567 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:29.567 "is_configured": true, 00:24:29.567 "data_offset": 0, 00:24:29.567 "data_size": 65536 00:24:29.567 } 00:24:29.567 ] 00:24:29.567 }' 00:24:29.567 09:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:29.567 09:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:30.134 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:30.134 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:30.392 [2024-07-15 09:28:39.285897] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:30.392 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:30.392 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.392 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:30.651 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:30.651 
09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:30.910 [2024-07-15 09:28:39.786975] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fe9e10 00:24:30.910 /dev/nbd0 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:30.910 1+0 records in 00:24:30.910 1+0 records out 00:24:30.910 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268713 s, 15.2 MB/s 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:30.910 09:28:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:39.062 65536+0 records in 00:24:39.062 65536+0 records out 00:24:39.062 33554432 bytes (34 MB, 32 MiB) copied, 7.44005 s, 4.5 MB/s 00:24:39.062 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:39.062 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:39.062 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:39.062 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:39.062 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:39.062 09:28:47 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:24:39.063 [2024-07-15 09:28:47.663597] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:39.063 [2024-07-15 09:28:47.912298] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.063 09:28:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.321 09:28:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.321 "name": "raid_bdev1", 00:24:39.321 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:39.321 "strip_size_kb": 0, 00:24:39.321 "state": "online", 00:24:39.321 "raid_level": "raid1", 
00:24:39.321 "superblock": false, 00:24:39.321 "num_base_bdevs": 4, 00:24:39.321 "num_base_bdevs_discovered": 3, 00:24:39.321 "num_base_bdevs_operational": 3, 00:24:39.321 "base_bdevs_list": [ 00:24:39.321 { 00:24:39.321 "name": null, 00:24:39.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.321 "is_configured": false, 00:24:39.321 "data_offset": 0, 00:24:39.321 "data_size": 65536 00:24:39.321 }, 00:24:39.321 { 00:24:39.321 "name": "BaseBdev2", 00:24:39.321 "uuid": "4f0c7017-1c81-59ad-a20c-c3aafe84aa53", 00:24:39.321 "is_configured": true, 00:24:39.321 "data_offset": 0, 00:24:39.321 "data_size": 65536 00:24:39.321 }, 00:24:39.321 { 00:24:39.321 "name": "BaseBdev3", 00:24:39.321 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:39.321 "is_configured": true, 00:24:39.321 "data_offset": 0, 00:24:39.321 "data_size": 65536 00:24:39.321 }, 00:24:39.321 { 00:24:39.321 "name": "BaseBdev4", 00:24:39.321 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:39.321 "is_configured": true, 00:24:39.321 "data_offset": 0, 00:24:39.321 "data_size": 65536 00:24:39.321 } 00:24:39.321 ] 00:24:39.321 }' 00:24:39.321 09:28:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.321 09:28:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:39.940 09:28:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:40.199 [2024-07-15 09:28:49.007224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:40.199 [2024-07-15 09:28:49.011338] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f756b0 00:24:40.199 [2024-07-15 09:28:49.013767] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:40.199 09:28:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:41.132 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:41.133 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.133 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:41.133 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:41.133 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.133 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.133 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.390 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.390 "name": "raid_bdev1", 00:24:41.390 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:41.390 "strip_size_kb": 0, 00:24:41.390 "state": "online", 00:24:41.390 "raid_level": "raid1", 00:24:41.390 "superblock": false, 00:24:41.390 "num_base_bdevs": 4, 00:24:41.390 "num_base_bdevs_discovered": 4, 00:24:41.390 "num_base_bdevs_operational": 4, 00:24:41.390 "process": { 00:24:41.390 "type": "rebuild", 00:24:41.390 "target": "spare", 00:24:41.390 "progress": { 00:24:41.390 "blocks": 22528, 00:24:41.390 "percent": 34 00:24:41.390 } 00:24:41.390 }, 00:24:41.390 "base_bdevs_list": [ 
00:24:41.390 { 00:24:41.390 "name": "spare", 00:24:41.390 "uuid": "9aae0762-d0f1-596d-8140-e126d7bc5b2d", 00:24:41.390 "is_configured": true, 00:24:41.390 "data_offset": 0, 00:24:41.390 "data_size": 65536 00:24:41.390 }, 00:24:41.390 { 00:24:41.390 "name": "BaseBdev2", 00:24:41.390 "uuid": "4f0c7017-1c81-59ad-a20c-c3aafe84aa53", 00:24:41.390 "is_configured": true, 00:24:41.390 "data_offset": 0, 00:24:41.390 "data_size": 65536 00:24:41.390 }, 00:24:41.390 { 00:24:41.390 "name": "BaseBdev3", 00:24:41.390 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:41.390 "is_configured": true, 00:24:41.390 "data_offset": 0, 00:24:41.390 "data_size": 65536 00:24:41.390 }, 00:24:41.390 { 00:24:41.390 "name": "BaseBdev4", 00:24:41.390 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:41.391 "is_configured": true, 00:24:41.391 "data_offset": 0, 00:24:41.391 "data_size": 65536 00:24:41.391 } 00:24:41.391 ] 00:24:41.391 }' 00:24:41.391 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.391 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:41.391 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.391 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:41.391 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:41.648 [2024-07-15 09:28:50.448113] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:41.648 [2024-07-15 09:28:50.525772] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:41.648 [2024-07-15 09:28:50.525820] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:41.648 [2024-07-15 09:28:50.525838] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:41.648 [2024-07-15 09:28:50.525847] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.648 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "raid_bdev1")' 00:24:41.906 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.906 "name": "raid_bdev1", 00:24:41.906 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:41.906 "strip_size_kb": 0, 00:24:41.906 "state": "online", 00:24:41.906 "raid_level": "raid1", 00:24:41.906 "superblock": false, 00:24:41.906 "num_base_bdevs": 4, 00:24:41.906 "num_base_bdevs_discovered": 3, 00:24:41.906 "num_base_bdevs_operational": 3, 00:24:41.906 "base_bdevs_list": [ 00:24:41.906 { 00:24:41.906 "name": null, 00:24:41.906 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.906 "is_configured": false, 00:24:41.906 "data_offset": 0, 00:24:41.906 "data_size": 65536 00:24:41.906 }, 00:24:41.906 { 00:24:41.906 "name": "BaseBdev2", 00:24:41.906 "uuid": "4f0c7017-1c81-59ad-a20c-c3aafe84aa53", 00:24:41.906 "is_configured": true, 00:24:41.906 "data_offset": 0, 00:24:41.906 "data_size": 65536 00:24:41.906 }, 00:24:41.906 { 00:24:41.906 "name": "BaseBdev3", 00:24:41.906 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:41.906 "is_configured": true, 00:24:41.906 "data_offset": 0, 00:24:41.906 "data_size": 65536 00:24:41.906 }, 00:24:41.906 { 00:24:41.906 "name": "BaseBdev4", 00:24:41.906 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:41.906 "is_configured": true, 00:24:41.906 "data_offset": 0, 00:24:41.906 "data_size": 65536 00:24:41.906 } 00:24:41.906 ] 00:24:41.906 }' 00:24:41.906 09:28:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.906 09:28:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:42.840 09:28:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:42.840 09:28:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:42.840 09:28:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:42.840 09:28:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:42.840 09:28:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:42.840 09:28:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.840 09:28:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.406 09:28:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:43.406 "name": "raid_bdev1", 00:24:43.406 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:43.406 "strip_size_kb": 0, 00:24:43.406 "state": "online", 00:24:43.406 "raid_level": "raid1", 00:24:43.406 "superblock": false, 00:24:43.406 "num_base_bdevs": 4, 00:24:43.406 "num_base_bdevs_discovered": 3, 00:24:43.406 "num_base_bdevs_operational": 3, 00:24:43.406 "base_bdevs_list": [ 00:24:43.406 { 00:24:43.406 "name": null, 00:24:43.406 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.406 "is_configured": false, 00:24:43.406 "data_offset": 0, 00:24:43.406 "data_size": 65536 00:24:43.406 }, 00:24:43.406 { 00:24:43.406 "name": "BaseBdev2", 00:24:43.406 "uuid": "4f0c7017-1c81-59ad-a20c-c3aafe84aa53", 00:24:43.406 "is_configured": true, 00:24:43.406 "data_offset": 0, 00:24:43.406 "data_size": 65536 00:24:43.406 }, 00:24:43.406 { 00:24:43.406 "name": "BaseBdev3", 00:24:43.406 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:43.406 
"is_configured": true, 00:24:43.406 "data_offset": 0, 00:24:43.406 "data_size": 65536 00:24:43.406 }, 00:24:43.406 { 00:24:43.406 "name": "BaseBdev4", 00:24:43.406 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:43.406 "is_configured": true, 00:24:43.406 "data_offset": 0, 00:24:43.406 "data_size": 65536 00:24:43.406 } 00:24:43.406 ] 00:24:43.406 }' 00:24:43.406 09:28:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:43.406 09:28:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:43.406 09:28:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:43.406 09:28:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:43.406 09:28:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:43.664 [2024-07-15 09:28:52.491113] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:43.664 [2024-07-15 09:28:52.495162] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f756b0 00:24:43.664 [2024-07-15 09:28:52.496734] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:43.664 09:28:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:44.596 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:44.596 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:44.596 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:44.596 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:44.596 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:44.596 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.596 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.852 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:44.852 "name": "raid_bdev1", 00:24:44.852 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:44.852 "strip_size_kb": 0, 00:24:44.853 "state": "online", 00:24:44.853 "raid_level": "raid1", 00:24:44.853 "superblock": false, 00:24:44.853 "num_base_bdevs": 4, 00:24:44.853 "num_base_bdevs_discovered": 4, 00:24:44.853 "num_base_bdevs_operational": 4, 00:24:44.853 "process": { 00:24:44.853 "type": "rebuild", 00:24:44.853 "target": "spare", 00:24:44.853 "progress": { 00:24:44.853 "blocks": 24576, 00:24:44.853 "percent": 37 00:24:44.853 } 00:24:44.853 }, 00:24:44.853 "base_bdevs_list": [ 00:24:44.853 { 00:24:44.853 "name": "spare", 00:24:44.853 "uuid": "9aae0762-d0f1-596d-8140-e126d7bc5b2d", 00:24:44.853 "is_configured": true, 00:24:44.853 "data_offset": 0, 00:24:44.853 "data_size": 65536 00:24:44.853 }, 00:24:44.853 { 00:24:44.853 "name": "BaseBdev2", 00:24:44.853 "uuid": "4f0c7017-1c81-59ad-a20c-c3aafe84aa53", 00:24:44.853 "is_configured": true, 00:24:44.853 "data_offset": 0, 00:24:44.853 "data_size": 65536 00:24:44.853 }, 00:24:44.853 { 00:24:44.853 "name": "BaseBdev3", 00:24:44.853 "uuid": 
"f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:44.853 "is_configured": true, 00:24:44.853 "data_offset": 0, 00:24:44.853 "data_size": 65536 00:24:44.853 }, 00:24:44.853 { 00:24:44.853 "name": "BaseBdev4", 00:24:44.853 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:44.853 "is_configured": true, 00:24:44.853 "data_offset": 0, 00:24:44.853 "data_size": 65536 00:24:44.853 } 00:24:44.853 ] 00:24:44.853 }' 00:24:44.853 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.110 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:45.110 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.110 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:45.110 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:45.110 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:45.110 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:45.110 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:45.110 09:28:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:45.367 [2024-07-15 09:28:54.076617] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:45.367 [2024-07-15 09:28:54.109162] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1f756b0 00:24:45.367 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:45.367 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:24:45.367 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:45.367 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.367 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:45.367 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:45.367 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.367 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.367 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.625 "name": "raid_bdev1", 00:24:45.625 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:45.625 "strip_size_kb": 0, 00:24:45.625 "state": "online", 00:24:45.625 "raid_level": "raid1", 00:24:45.625 "superblock": false, 00:24:45.625 "num_base_bdevs": 4, 00:24:45.625 "num_base_bdevs_discovered": 3, 00:24:45.625 "num_base_bdevs_operational": 3, 00:24:45.625 "process": { 00:24:45.625 "type": "rebuild", 00:24:45.625 "target": "spare", 00:24:45.625 "progress": { 00:24:45.625 "blocks": 36864, 00:24:45.625 "percent": 56 00:24:45.625 } 00:24:45.625 }, 00:24:45.625 "base_bdevs_list": [ 00:24:45.625 { 00:24:45.625 "name": "spare", 00:24:45.625 "uuid": 
"9aae0762-d0f1-596d-8140-e126d7bc5b2d", 00:24:45.625 "is_configured": true, 00:24:45.625 "data_offset": 0, 00:24:45.625 "data_size": 65536 00:24:45.625 }, 00:24:45.625 { 00:24:45.625 "name": null, 00:24:45.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.625 "is_configured": false, 00:24:45.625 "data_offset": 0, 00:24:45.625 "data_size": 65536 00:24:45.625 }, 00:24:45.625 { 00:24:45.625 "name": "BaseBdev3", 00:24:45.625 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:45.625 "is_configured": true, 00:24:45.625 "data_offset": 0, 00:24:45.625 "data_size": 65536 00:24:45.625 }, 00:24:45.625 { 00:24:45.625 "name": "BaseBdev4", 00:24:45.625 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:45.625 "is_configured": true, 00:24:45.625 "data_offset": 0, 00:24:45.625 "data_size": 65536 00:24:45.625 } 00:24:45.625 ] 00:24:45.625 }' 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=878 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.625 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.883 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.883 "name": "raid_bdev1", 00:24:45.883 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:45.883 "strip_size_kb": 0, 00:24:45.883 "state": "online", 00:24:45.883 "raid_level": "raid1", 00:24:45.883 "superblock": false, 00:24:45.883 "num_base_bdevs": 4, 00:24:45.883 "num_base_bdevs_discovered": 3, 00:24:45.883 "num_base_bdevs_operational": 3, 00:24:45.883 "process": { 00:24:45.883 "type": "rebuild", 00:24:45.883 "target": "spare", 00:24:45.883 "progress": { 00:24:45.883 "blocks": 43008, 00:24:45.883 "percent": 65 00:24:45.883 } 00:24:45.883 }, 00:24:45.883 "base_bdevs_list": [ 00:24:45.883 { 00:24:45.883 "name": "spare", 00:24:45.883 "uuid": "9aae0762-d0f1-596d-8140-e126d7bc5b2d", 00:24:45.883 "is_configured": true, 00:24:45.883 "data_offset": 0, 00:24:45.883 "data_size": 65536 00:24:45.883 }, 00:24:45.883 { 00:24:45.883 "name": null, 00:24:45.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.883 "is_configured": false, 00:24:45.883 "data_offset": 0, 00:24:45.883 "data_size": 65536 00:24:45.883 }, 00:24:45.883 { 00:24:45.883 "name": "BaseBdev3", 
00:24:45.883 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:45.883 "is_configured": true, 00:24:45.883 "data_offset": 0, 00:24:45.883 "data_size": 65536 00:24:45.883 }, 00:24:45.883 { 00:24:45.883 "name": "BaseBdev4", 00:24:45.883 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:45.883 "is_configured": true, 00:24:45.883 "data_offset": 0, 00:24:45.883 "data_size": 65536 00:24:45.883 } 00:24:45.883 ] 00:24:45.883 }' 00:24:45.883 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.883 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:45.883 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.883 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:45.883 09:28:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:46.817 [2024-07-15 09:28:55.721660] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:46.818 [2024-07-15 09:28:55.721720] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:46.818 [2024-07-15 09:28:55.721764] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:46.818 09:28:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:46.818 09:28:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:46.818 09:28:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.818 09:28:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:46.818 09:28:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:46.818 09:28:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.818 09:28:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.818 09:28:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.076 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:47.076 "name": "raid_bdev1", 00:24:47.076 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:47.076 "strip_size_kb": 0, 00:24:47.076 "state": "online", 00:24:47.076 "raid_level": "raid1", 00:24:47.076 "superblock": false, 00:24:47.076 "num_base_bdevs": 4, 00:24:47.076 "num_base_bdevs_discovered": 3, 00:24:47.076 "num_base_bdevs_operational": 3, 00:24:47.076 "base_bdevs_list": [ 00:24:47.076 { 00:24:47.076 "name": "spare", 00:24:47.076 "uuid": "9aae0762-d0f1-596d-8140-e126d7bc5b2d", 00:24:47.076 "is_configured": true, 00:24:47.076 "data_offset": 0, 00:24:47.076 "data_size": 65536 00:24:47.076 }, 00:24:47.076 { 00:24:47.076 "name": null, 00:24:47.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.076 "is_configured": false, 00:24:47.076 "data_offset": 0, 00:24:47.076 "data_size": 65536 00:24:47.076 }, 00:24:47.076 { 00:24:47.076 "name": "BaseBdev3", 00:24:47.076 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:47.076 "is_configured": true, 00:24:47.076 "data_offset": 0, 00:24:47.076 "data_size": 65536 00:24:47.076 }, 00:24:47.076 { 00:24:47.076 "name": "BaseBdev4", 00:24:47.076 "uuid": 
"72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:47.076 "is_configured": true, 00:24:47.076 "data_offset": 0, 00:24:47.076 "data_size": 65536 00:24:47.076 } 00:24:47.076 ] 00:24:47.076 }' 00:24:47.076 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.335 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.594 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:47.594 "name": "raid_bdev1", 00:24:47.594 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:47.594 "strip_size_kb": 0, 00:24:47.594 "state": "online", 00:24:47.594 "raid_level": "raid1", 00:24:47.594 "superblock": false, 00:24:47.594 "num_base_bdevs": 4, 00:24:47.594 "num_base_bdevs_discovered": 3, 00:24:47.595 "num_base_bdevs_operational": 3, 00:24:47.595 "base_bdevs_list": [ 00:24:47.595 { 00:24:47.595 "name": "spare", 00:24:47.595 "uuid": "9aae0762-d0f1-596d-8140-e126d7bc5b2d", 00:24:47.595 "is_configured": true, 00:24:47.595 "data_offset": 0, 00:24:47.595 "data_size": 65536 00:24:47.595 }, 00:24:47.595 { 00:24:47.595 "name": null, 00:24:47.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.595 "is_configured": false, 00:24:47.595 "data_offset": 0, 00:24:47.595 "data_size": 65536 00:24:47.595 }, 00:24:47.595 { 00:24:47.595 "name": "BaseBdev3", 00:24:47.595 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:47.595 "is_configured": true, 00:24:47.595 "data_offset": 0, 00:24:47.595 "data_size": 65536 00:24:47.595 }, 00:24:47.595 { 00:24:47.595 "name": "BaseBdev4", 00:24:47.595 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:47.595 "is_configured": true, 00:24:47.595 "data_offset": 0, 00:24:47.595 "data_size": 65536 00:24:47.595 } 00:24:47.595 ] 00:24:47.595 }' 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:47.595 
09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.595 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.854 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:47.854 "name": "raid_bdev1", 00:24:47.854 "uuid": "8f866d45-1c9a-421b-8b88-1d5e9f6026c1", 00:24:47.854 "strip_size_kb": 0, 00:24:47.854 "state": "online", 00:24:47.854 "raid_level": "raid1", 00:24:47.854 "superblock": false, 00:24:47.854 "num_base_bdevs": 4, 00:24:47.854 "num_base_bdevs_discovered": 3, 00:24:47.854 "num_base_bdevs_operational": 3, 00:24:47.854 "base_bdevs_list": [ 00:24:47.854 { 00:24:47.854 "name": "spare", 00:24:47.854 "uuid": "9aae0762-d0f1-596d-8140-e126d7bc5b2d", 00:24:47.854 "is_configured": true, 00:24:47.854 "data_offset": 0, 00:24:47.854 "data_size": 65536 00:24:47.854 }, 00:24:47.854 { 00:24:47.854 "name": null, 00:24:47.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.854 "is_configured": false, 00:24:47.854 "data_offset": 0, 00:24:47.854 "data_size": 65536 00:24:47.854 }, 00:24:47.854 { 00:24:47.854 "name": "BaseBdev3", 00:24:47.854 "uuid": "f99848a9-050c-5d4f-a8b1-f5f13b459e07", 00:24:47.854 "is_configured": true, 00:24:47.854 "data_offset": 0, 00:24:47.854 "data_size": 65536 00:24:47.854 }, 00:24:47.854 { 00:24:47.854 "name": "BaseBdev4", 00:24:47.855 "uuid": "72d20806-052c-5e28-9296-cddd7ddfab19", 00:24:47.855 "is_configured": true, 00:24:47.855 "data_offset": 0, 00:24:47.855 "data_size": 65536 00:24:47.855 } 00:24:47.855 ] 00:24:47.855 }' 00:24:47.855 09:28:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:47.855 09:28:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:48.422 09:28:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:48.681 [2024-07-15 09:28:57.399077] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:48.681 [2024-07-15 09:28:57.399106] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:48.681 [2024-07-15 09:28:57.399162] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:48.681 [2024-07-15 09:28:57.399238] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:24:48.681 [2024-07-15 09:28:57.399249] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f6f8a0 name raid_bdev1, state offline 00:24:48.681 09:28:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:24:48.681 09:28:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:48.941 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:49.200 /dev/nbd0 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:49.200 1+0 records in 00:24:49.200 1+0 records out 00:24:49.200 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000161332 s, 25.4 MB/s 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 
00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:49.200 09:28:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:49.459 /dev/nbd1 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:49.459 1+0 records in 00:24:49.459 1+0 records out 00:24:49.459 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288965 s, 14.2 MB/s 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:49.459 09:28:58 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:49.459 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:49.718 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:49.977 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:49.977 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:49.977 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:49.977 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 206320 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 206320 ']' 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 206320 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:49.978 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 206320 00:24:50.237 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:50.237 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:50.237 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 206320' 00:24:50.237 killing process with pid 206320 00:24:50.237 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 206320 00:24:50.237 Received shutdown signal, test time was about 60.000000 seconds 00:24:50.237 00:24:50.237 
Latency(us) 00:24:50.237 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:50.237 =================================================================================================================== 00:24:50.237 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:50.237 [2024-07-15 09:28:58.949308] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:50.237 09:28:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 206320 00:24:50.237 [2024-07-15 09:28:58.997840] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:24:50.496 00:24:50.496 real 0m25.025s 00:24:50.496 user 0m34.060s 00:24:50.496 sys 0m5.208s 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:50.496 ************************************ 00:24:50.496 END TEST raid_rebuild_test 00:24:50.496 ************************************ 00:24:50.496 09:28:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:50.496 09:28:59 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:24:50.496 09:28:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:50.496 09:28:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:50.496 09:28:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:50.496 ************************************ 00:24:50.496 START TEST raid_rebuild_test_sb 00:24:50.496 ************************************ 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:50.496 09:28:59 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:50.496 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=209728 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 209728 /var/tmp/spdk-raid.sock 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 209728 ']' 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:50.497 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:50.497 09:28:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:50.497 [2024-07-15 09:28:59.386145] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:24:50.497 [2024-07-15 09:28:59.386217] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid209728 ] 00:24:50.497 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:50.497 Zero copy mechanism will not be used. 
00:24:50.756 [2024-07-15 09:28:59.516017] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:50.756 [2024-07-15 09:28:59.623542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:50.756 [2024-07-15 09:28:59.686785] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:50.756 [2024-07-15 09:28:59.686818] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:51.693 09:29:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:51.693 09:29:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:24:51.693 09:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:51.693 09:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:51.693 BaseBdev1_malloc 00:24:51.693 09:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:51.952 [2024-07-15 09:29:00.796409] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:51.952 [2024-07-15 09:29:00.796458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:51.952 [2024-07-15 09:29:00.796479] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c45d40 00:24:51.952 [2024-07-15 09:29:00.796492] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:51.952 [2024-07-15 09:29:00.798075] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:51.952 [2024-07-15 09:29:00.798106] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:51.952 BaseBdev1 00:24:51.952 09:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:51.952 09:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:52.211 BaseBdev2_malloc 00:24:52.211 09:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:52.470 [2024-07-15 09:29:01.286562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:52.470 [2024-07-15 09:29:01.286610] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.470 [2024-07-15 09:29:01.286633] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c46860 00:24:52.470 [2024-07-15 09:29:01.286647] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.470 [2024-07-15 09:29:01.288127] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.470 [2024-07-15 09:29:01.288157] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:52.470 BaseBdev2 00:24:52.470 09:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:52.470 09:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:52.729 BaseBdev3_malloc 00:24:52.729 09:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:52.988 [2024-07-15 09:29:01.784531] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:52.988 [2024-07-15 09:29:01.784583] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.988 [2024-07-15 09:29:01.784604] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df38f0 00:24:52.988 [2024-07-15 09:29:01.784617] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.988 [2024-07-15 09:29:01.786098] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.988 [2024-07-15 09:29:01.786129] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:52.989 BaseBdev3 00:24:52.989 09:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:52.989 09:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:53.248 BaseBdev4_malloc 00:24:53.248 09:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:53.507 [2024-07-15 09:29:02.278432] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:53.507 [2024-07-15 09:29:02.278479] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:53.507 [2024-07-15 09:29:02.278499] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df2ad0 00:24:53.507 [2024-07-15 09:29:02.278512] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:53.507 [2024-07-15 09:29:02.279924] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:53.507 [2024-07-15 09:29:02.279961] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:53.507 BaseBdev4 00:24:53.507 09:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:53.766 spare_malloc 00:24:53.766 09:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:54.067 spare_delay 00:24:54.067 09:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:54.326 [2024-07-15 09:29:03.020990] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:54.326 [2024-07-15 09:29:03.021039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:54.326 [2024-07-15 09:29:03.021059] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df75b0 00:24:54.326 [2024-07-15 09:29:03.021072] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:54.326 [2024-07-15 09:29:03.022506] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:54.326 [2024-07-15 09:29:03.022536] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:54.326 spare 00:24:54.326 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:54.326 [2024-07-15 09:29:03.261663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:54.326 [2024-07-15 09:29:03.262971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:54.326 [2024-07-15 09:29:03.263028] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:54.326 [2024-07-15 09:29:03.263074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:54.326 [2024-07-15 09:29:03.263271] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d768a0 00:24:54.326 [2024-07-15 09:29:03.263283] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:54.326 [2024-07-15 09:29:03.263487] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df0e10 00:24:54.326 [2024-07-15 09:29:03.263641] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d768a0 00:24:54.326 [2024-07-15 09:29:03.263651] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d768a0 00:24:54.326 [2024-07-15 09:29:03.263746] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:54.584 "name": "raid_bdev1", 00:24:54.584 "uuid": 
"a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:24:54.584 "strip_size_kb": 0, 00:24:54.584 "state": "online", 00:24:54.584 "raid_level": "raid1", 00:24:54.584 "superblock": true, 00:24:54.584 "num_base_bdevs": 4, 00:24:54.584 "num_base_bdevs_discovered": 4, 00:24:54.584 "num_base_bdevs_operational": 4, 00:24:54.584 "base_bdevs_list": [ 00:24:54.584 { 00:24:54.584 "name": "BaseBdev1", 00:24:54.584 "uuid": "9e3e9bf5-4a79-5f9d-8e03-712b4b13a101", 00:24:54.584 "is_configured": true, 00:24:54.584 "data_offset": 2048, 00:24:54.584 "data_size": 63488 00:24:54.584 }, 00:24:54.584 { 00:24:54.584 "name": "BaseBdev2", 00:24:54.584 "uuid": "fb12b839-00dc-5585-8f42-1136728c89e8", 00:24:54.584 "is_configured": true, 00:24:54.584 "data_offset": 2048, 00:24:54.584 "data_size": 63488 00:24:54.584 }, 00:24:54.584 { 00:24:54.584 "name": "BaseBdev3", 00:24:54.584 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:24:54.584 "is_configured": true, 00:24:54.584 "data_offset": 2048, 00:24:54.584 "data_size": 63488 00:24:54.584 }, 00:24:54.584 { 00:24:54.584 "name": "BaseBdev4", 00:24:54.584 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:24:54.584 "is_configured": true, 00:24:54.584 "data_offset": 2048, 00:24:54.584 "data_size": 63488 00:24:54.584 } 00:24:54.584 ] 00:24:54.584 }' 00:24:54.584 09:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:54.842 09:29:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:55.408 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:55.408 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:55.408 [2024-07-15 09:29:04.332756] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:55.408 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:55.408 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.408 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:55.667 09:29:04 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:55.667 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:55.926 [2024-07-15 09:29:04.841835] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df0e10 00:24:55.926 /dev/nbd0 00:24:55.926 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:55.926 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:55.926 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:55.926 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:55.926 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:55.926 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:55.926 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:55.926 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:56.186 1+0 records in 00:24:56.186 1+0 records out 00:24:56.186 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265388 s, 15.4 MB/s 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:56.186 09:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:25:04.300 63488+0 records in 00:25:04.300 63488+0 records out 00:25:04.300 32505856 bytes (33 MB, 31 MiB) copied, 7.49216 s, 4.3 MB/s 00:25:04.300 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:04.300 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:04.301 [2024-07-15 09:29:12.603854] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:04.301 [2024-07-15 09:29:12.832515] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.301 09:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.301 09:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.301 "name": "raid_bdev1", 00:25:04.301 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:04.301 "strip_size_kb": 0, 00:25:04.301 "state": "online", 00:25:04.301 "raid_level": "raid1", 00:25:04.301 "superblock": true, 00:25:04.301 
"num_base_bdevs": 4, 00:25:04.301 "num_base_bdevs_discovered": 3, 00:25:04.301 "num_base_bdevs_operational": 3, 00:25:04.301 "base_bdevs_list": [ 00:25:04.301 { 00:25:04.301 "name": null, 00:25:04.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.301 "is_configured": false, 00:25:04.301 "data_offset": 2048, 00:25:04.301 "data_size": 63488 00:25:04.301 }, 00:25:04.301 { 00:25:04.301 "name": "BaseBdev2", 00:25:04.301 "uuid": "fb12b839-00dc-5585-8f42-1136728c89e8", 00:25:04.301 "is_configured": true, 00:25:04.301 "data_offset": 2048, 00:25:04.301 "data_size": 63488 00:25:04.301 }, 00:25:04.301 { 00:25:04.301 "name": "BaseBdev3", 00:25:04.301 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:04.301 "is_configured": true, 00:25:04.301 "data_offset": 2048, 00:25:04.301 "data_size": 63488 00:25:04.301 }, 00:25:04.301 { 00:25:04.301 "name": "BaseBdev4", 00:25:04.301 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:04.301 "is_configured": true, 00:25:04.301 "data_offset": 2048, 00:25:04.301 "data_size": 63488 00:25:04.301 } 00:25:04.301 ] 00:25:04.301 }' 00:25:04.301 09:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.301 09:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:04.866 09:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:05.123 [2024-07-15 09:29:13.859242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:05.123 [2024-07-15 09:29:13.863359] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df0e10 00:25:05.123 [2024-07-15 09:29:13.865726] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:05.123 09:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:06.057 09:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:06.057 09:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.057 09:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:06.057 09:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:06.057 09:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:06.057 09:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.057 09:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.316 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:06.316 "name": "raid_bdev1", 00:25:06.316 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:06.316 "strip_size_kb": 0, 00:25:06.316 "state": "online", 00:25:06.316 "raid_level": "raid1", 00:25:06.316 "superblock": true, 00:25:06.316 "num_base_bdevs": 4, 00:25:06.316 "num_base_bdevs_discovered": 4, 00:25:06.316 "num_base_bdevs_operational": 4, 00:25:06.316 "process": { 00:25:06.316 "type": "rebuild", 00:25:06.316 "target": "spare", 00:25:06.316 "progress": { 00:25:06.316 "blocks": 24576, 00:25:06.316 "percent": 38 00:25:06.316 } 00:25:06.317 }, 00:25:06.317 "base_bdevs_list": [ 
00:25:06.317 { 00:25:06.317 "name": "spare", 00:25:06.317 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:06.317 "is_configured": true, 00:25:06.317 "data_offset": 2048, 00:25:06.317 "data_size": 63488 00:25:06.317 }, 00:25:06.317 { 00:25:06.317 "name": "BaseBdev2", 00:25:06.317 "uuid": "fb12b839-00dc-5585-8f42-1136728c89e8", 00:25:06.317 "is_configured": true, 00:25:06.317 "data_offset": 2048, 00:25:06.317 "data_size": 63488 00:25:06.317 }, 00:25:06.317 { 00:25:06.317 "name": "BaseBdev3", 00:25:06.317 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:06.317 "is_configured": true, 00:25:06.317 "data_offset": 2048, 00:25:06.317 "data_size": 63488 00:25:06.317 }, 00:25:06.317 { 00:25:06.317 "name": "BaseBdev4", 00:25:06.317 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:06.317 "is_configured": true, 00:25:06.317 "data_offset": 2048, 00:25:06.317 "data_size": 63488 00:25:06.317 } 00:25:06.317 ] 00:25:06.317 }' 00:25:06.317 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:06.317 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:06.317 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:06.317 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:06.317 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:06.576 [2024-07-15 09:29:15.452613] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:06.576 [2024-07-15 09:29:15.478138] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:06.576 [2024-07-15 09:29:15.478183] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:06.576 [2024-07-15 09:29:15.478201] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:06.576 [2024-07-15 09:29:15.478210] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:06.576 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.576 09:29:15 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.834 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:06.834 "name": "raid_bdev1", 00:25:06.834 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:06.834 "strip_size_kb": 0, 00:25:06.834 "state": "online", 00:25:06.834 "raid_level": "raid1", 00:25:06.834 "superblock": true, 00:25:06.834 "num_base_bdevs": 4, 00:25:06.834 "num_base_bdevs_discovered": 3, 00:25:06.834 "num_base_bdevs_operational": 3, 00:25:06.834 "base_bdevs_list": [ 00:25:06.834 { 00:25:06.834 "name": null, 00:25:06.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:06.834 "is_configured": false, 00:25:06.834 "data_offset": 2048, 00:25:06.834 "data_size": 63488 00:25:06.834 }, 00:25:06.834 { 00:25:06.834 "name": "BaseBdev2", 00:25:06.834 "uuid": "fb12b839-00dc-5585-8f42-1136728c89e8", 00:25:06.834 "is_configured": true, 00:25:06.834 "data_offset": 2048, 00:25:06.834 "data_size": 63488 00:25:06.834 }, 00:25:06.834 { 00:25:06.834 "name": "BaseBdev3", 00:25:06.834 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:06.834 "is_configured": true, 00:25:06.834 "data_offset": 2048, 00:25:06.834 "data_size": 63488 00:25:06.834 }, 00:25:06.834 { 00:25:06.834 "name": "BaseBdev4", 00:25:06.834 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:06.834 "is_configured": true, 00:25:06.834 "data_offset": 2048, 00:25:06.834 "data_size": 63488 00:25:06.834 } 00:25:06.834 ] 00:25:06.834 }' 00:25:06.834 09:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:06.834 09:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:07.767 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:07.767 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:07.767 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:07.767 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:07.767 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:07.767 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.767 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.767 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:07.767 "name": "raid_bdev1", 00:25:07.767 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:07.767 "strip_size_kb": 0, 00:25:07.767 "state": "online", 00:25:07.767 "raid_level": "raid1", 00:25:07.767 "superblock": true, 00:25:07.767 "num_base_bdevs": 4, 00:25:07.767 "num_base_bdevs_discovered": 3, 00:25:07.767 "num_base_bdevs_operational": 3, 00:25:07.767 "base_bdevs_list": [ 00:25:07.767 { 00:25:07.767 "name": null, 00:25:07.767 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.768 "is_configured": false, 00:25:07.768 "data_offset": 2048, 00:25:07.768 "data_size": 63488 00:25:07.768 }, 00:25:07.768 { 00:25:07.768 "name": "BaseBdev2", 00:25:07.768 "uuid": "fb12b839-00dc-5585-8f42-1136728c89e8", 00:25:07.768 "is_configured": true, 00:25:07.768 "data_offset": 2048, 00:25:07.768 "data_size": 63488 00:25:07.768 }, 
00:25:07.768 { 00:25:07.768 "name": "BaseBdev3", 00:25:07.768 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:07.768 "is_configured": true, 00:25:07.768 "data_offset": 2048, 00:25:07.768 "data_size": 63488 00:25:07.768 }, 00:25:07.768 { 00:25:07.768 "name": "BaseBdev4", 00:25:07.768 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:07.768 "is_configured": true, 00:25:07.768 "data_offset": 2048, 00:25:07.768 "data_size": 63488 00:25:07.768 } 00:25:07.768 ] 00:25:07.768 }' 00:25:07.768 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:07.768 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:07.768 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:07.768 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:07.768 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:08.026 [2024-07-15 09:29:16.934665] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.026 [2024-07-15 09:29:16.939347] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d76e90 00:25:08.026 [2024-07-15 09:29:16.940901] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:08.026 09:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:09.402 09:29:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.402 09:29:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.402 09:29:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.402 09:29:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.402 09:29:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.402 09:29:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.402 09:29:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.402 "name": "raid_bdev1", 00:25:09.402 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:09.402 "strip_size_kb": 0, 00:25:09.402 "state": "online", 00:25:09.402 "raid_level": "raid1", 00:25:09.402 "superblock": true, 00:25:09.402 "num_base_bdevs": 4, 00:25:09.402 "num_base_bdevs_discovered": 4, 00:25:09.402 "num_base_bdevs_operational": 4, 00:25:09.402 "process": { 00:25:09.402 "type": "rebuild", 00:25:09.402 "target": "spare", 00:25:09.402 "progress": { 00:25:09.402 "blocks": 24576, 00:25:09.402 "percent": 38 00:25:09.402 } 00:25:09.402 }, 00:25:09.402 "base_bdevs_list": [ 00:25:09.402 { 00:25:09.402 "name": "spare", 00:25:09.402 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:09.402 "is_configured": true, 00:25:09.402 "data_offset": 2048, 00:25:09.402 "data_size": 63488 00:25:09.402 }, 00:25:09.402 { 00:25:09.402 "name": "BaseBdev2", 00:25:09.402 "uuid": "fb12b839-00dc-5585-8f42-1136728c89e8", 00:25:09.402 
"is_configured": true, 00:25:09.402 "data_offset": 2048, 00:25:09.402 "data_size": 63488 00:25:09.402 }, 00:25:09.402 { 00:25:09.402 "name": "BaseBdev3", 00:25:09.402 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:09.402 "is_configured": true, 00:25:09.402 "data_offset": 2048, 00:25:09.402 "data_size": 63488 00:25:09.402 }, 00:25:09.402 { 00:25:09.402 "name": "BaseBdev4", 00:25:09.402 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:09.402 "is_configured": true, 00:25:09.402 "data_offset": 2048, 00:25:09.402 "data_size": 63488 00:25:09.402 } 00:25:09.402 ] 00:25:09.402 }' 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:09.402 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:09.402 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:09.661 [2024-07-15 09:29:18.516894] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:09.920 [2024-07-15 09:29:18.653763] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1d76e90 00:25:09.920 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:09.920 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:09.920 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.920 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.920 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.920 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.920 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.920 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.920 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.178 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:10.178 "name": "raid_bdev1", 00:25:10.178 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:10.178 "strip_size_kb": 0, 00:25:10.178 "state": "online", 00:25:10.178 "raid_level": "raid1", 
00:25:10.178 "superblock": true, 00:25:10.178 "num_base_bdevs": 4, 00:25:10.178 "num_base_bdevs_discovered": 3, 00:25:10.178 "num_base_bdevs_operational": 3, 00:25:10.178 "process": { 00:25:10.178 "type": "rebuild", 00:25:10.178 "target": "spare", 00:25:10.178 "progress": { 00:25:10.178 "blocks": 36864, 00:25:10.178 "percent": 58 00:25:10.178 } 00:25:10.178 }, 00:25:10.178 "base_bdevs_list": [ 00:25:10.178 { 00:25:10.178 "name": "spare", 00:25:10.178 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:10.178 "is_configured": true, 00:25:10.178 "data_offset": 2048, 00:25:10.178 "data_size": 63488 00:25:10.178 }, 00:25:10.178 { 00:25:10.178 "name": null, 00:25:10.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.178 "is_configured": false, 00:25:10.178 "data_offset": 2048, 00:25:10.178 "data_size": 63488 00:25:10.178 }, 00:25:10.178 { 00:25:10.178 "name": "BaseBdev3", 00:25:10.178 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:10.178 "is_configured": true, 00:25:10.178 "data_offset": 2048, 00:25:10.178 "data_size": 63488 00:25:10.178 }, 00:25:10.178 { 00:25:10.178 "name": "BaseBdev4", 00:25:10.178 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:10.178 "is_configured": true, 00:25:10.178 "data_offset": 2048, 00:25:10.178 "data_size": 63488 00:25:10.178 } 00:25:10.178 ] 00:25:10.178 }' 00:25:10.178 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.178 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:10.178 09:29:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=903 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.178 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.437 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:10.437 "name": "raid_bdev1", 00:25:10.437 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:10.437 "strip_size_kb": 0, 00:25:10.437 "state": "online", 00:25:10.437 "raid_level": "raid1", 00:25:10.437 "superblock": true, 00:25:10.437 "num_base_bdevs": 4, 00:25:10.437 "num_base_bdevs_discovered": 3, 00:25:10.437 "num_base_bdevs_operational": 3, 00:25:10.437 "process": { 00:25:10.437 "type": "rebuild", 00:25:10.437 "target": "spare", 00:25:10.437 "progress": { 00:25:10.437 "blocks": 43008, 00:25:10.437 "percent": 67 00:25:10.437 } 00:25:10.437 }, 00:25:10.437 
"base_bdevs_list": [ 00:25:10.437 { 00:25:10.437 "name": "spare", 00:25:10.437 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:10.437 "is_configured": true, 00:25:10.437 "data_offset": 2048, 00:25:10.437 "data_size": 63488 00:25:10.437 }, 00:25:10.437 { 00:25:10.437 "name": null, 00:25:10.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.437 "is_configured": false, 00:25:10.437 "data_offset": 2048, 00:25:10.437 "data_size": 63488 00:25:10.437 }, 00:25:10.437 { 00:25:10.437 "name": "BaseBdev3", 00:25:10.437 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:10.437 "is_configured": true, 00:25:10.437 "data_offset": 2048, 00:25:10.437 "data_size": 63488 00:25:10.437 }, 00:25:10.437 { 00:25:10.437 "name": "BaseBdev4", 00:25:10.437 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:10.437 "is_configured": true, 00:25:10.437 "data_offset": 2048, 00:25:10.437 "data_size": 63488 00:25:10.437 } 00:25:10.437 ] 00:25:10.437 }' 00:25:10.437 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.437 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:10.437 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.437 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.437 09:29:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:11.371 [2024-07-15 09:29:20.165613] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:11.371 [2024-07-15 09:29:20.165678] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:11.371 [2024-07-15 09:29:20.165776] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:11.629 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:11.629 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:11.629 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:11.629 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:11.629 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:11.629 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:11.629 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.629 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:11.888 "name": "raid_bdev1", 00:25:11.888 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:11.888 "strip_size_kb": 0, 00:25:11.888 "state": "online", 00:25:11.888 "raid_level": "raid1", 00:25:11.888 "superblock": true, 00:25:11.888 "num_base_bdevs": 4, 00:25:11.888 "num_base_bdevs_discovered": 3, 00:25:11.888 "num_base_bdevs_operational": 3, 00:25:11.888 "base_bdevs_list": [ 00:25:11.888 { 00:25:11.888 "name": "spare", 00:25:11.888 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:11.888 "is_configured": true, 00:25:11.888 "data_offset": 2048, 
00:25:11.888 "data_size": 63488 00:25:11.888 }, 00:25:11.888 { 00:25:11.888 "name": null, 00:25:11.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:11.888 "is_configured": false, 00:25:11.888 "data_offset": 2048, 00:25:11.888 "data_size": 63488 00:25:11.888 }, 00:25:11.888 { 00:25:11.888 "name": "BaseBdev3", 00:25:11.888 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:11.888 "is_configured": true, 00:25:11.888 "data_offset": 2048, 00:25:11.888 "data_size": 63488 00:25:11.888 }, 00:25:11.888 { 00:25:11.888 "name": "BaseBdev4", 00:25:11.888 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:11.888 "is_configured": true, 00:25:11.888 "data_offset": 2048, 00:25:11.888 "data_size": 63488 00:25:11.888 } 00:25:11.888 ] 00:25:11.888 }' 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.888 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.147 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.147 "name": "raid_bdev1", 00:25:12.147 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:12.147 "strip_size_kb": 0, 00:25:12.147 "state": "online", 00:25:12.147 "raid_level": "raid1", 00:25:12.147 "superblock": true, 00:25:12.147 "num_base_bdevs": 4, 00:25:12.147 "num_base_bdevs_discovered": 3, 00:25:12.147 "num_base_bdevs_operational": 3, 00:25:12.147 "base_bdevs_list": [ 00:25:12.147 { 00:25:12.147 "name": "spare", 00:25:12.147 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:12.147 "is_configured": true, 00:25:12.147 "data_offset": 2048, 00:25:12.147 "data_size": 63488 00:25:12.147 }, 00:25:12.147 { 00:25:12.147 "name": null, 00:25:12.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.147 "is_configured": false, 00:25:12.147 "data_offset": 2048, 00:25:12.147 "data_size": 63488 00:25:12.147 }, 00:25:12.147 { 00:25:12.147 "name": "BaseBdev3", 00:25:12.147 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:12.147 "is_configured": true, 00:25:12.147 "data_offset": 2048, 00:25:12.147 "data_size": 63488 00:25:12.147 }, 00:25:12.147 { 00:25:12.147 "name": "BaseBdev4", 00:25:12.147 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:12.147 "is_configured": true, 00:25:12.147 "data_offset": 2048, 00:25:12.147 "data_size": 63488 
00:25:12.147 } 00:25:12.147 ] 00:25:12.147 }' 00:25:12.147 09:29:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.147 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.406 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.406 "name": "raid_bdev1", 00:25:12.406 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:12.406 "strip_size_kb": 0, 00:25:12.406 "state": "online", 00:25:12.406 "raid_level": "raid1", 00:25:12.406 "superblock": true, 00:25:12.406 "num_base_bdevs": 4, 00:25:12.406 "num_base_bdevs_discovered": 3, 00:25:12.406 "num_base_bdevs_operational": 3, 00:25:12.406 "base_bdevs_list": [ 00:25:12.406 { 00:25:12.406 "name": "spare", 00:25:12.406 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:12.406 "is_configured": true, 00:25:12.406 "data_offset": 2048, 00:25:12.406 "data_size": 63488 00:25:12.406 }, 00:25:12.406 { 00:25:12.406 "name": null, 00:25:12.406 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.406 "is_configured": false, 00:25:12.406 "data_offset": 2048, 00:25:12.406 "data_size": 63488 00:25:12.406 }, 00:25:12.406 { 00:25:12.406 "name": "BaseBdev3", 00:25:12.406 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:12.406 "is_configured": true, 00:25:12.406 "data_offset": 2048, 00:25:12.406 "data_size": 63488 00:25:12.406 }, 00:25:12.406 { 00:25:12.406 "name": "BaseBdev4", 00:25:12.406 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:12.406 "is_configured": true, 00:25:12.406 "data_offset": 2048, 00:25:12.406 "data_size": 63488 00:25:12.406 } 00:25:12.406 ] 00:25:12.406 }' 00:25:12.406 09:29:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.406 09:29:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:12.972 09:29:21 
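
Every verify_raid_bdev_state / verify_raid_bdev_process block traced above boils down to the same two-step query: dump all raid bdevs over the test RPC socket, then pick out raid_bdev1 and its rebuild fields with jq. The snippet below merely lifts those commands out of the trace into a standalone form as a reading aid; it is not an excerpt from bdev_raid.sh:

    # Fetch the JSON for raid_bdev1 from the target listening on the test socket
    raid_bdev_info=$(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
        jq -r '.[] | select(.name == "raid_bdev1")')

    # Rebuild checks used throughout the run, defaulting to "none" when no process is active
    echo "$raid_bdev_info" | jq -r '.process.type   // "none"'    # e.g. "rebuild"
    echo "$raid_bdev_info" | jq -r '.process.target // "none"'    # e.g. "spare"
    echo "$raid_bdev_info" | jq -r '.state'                       # e.g. "online"
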
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:13.230 [2024-07-15 09:29:22.062946] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:13.230 [2024-07-15 09:29:22.062977] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:13.230 [2024-07-15 09:29:22.063045] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:13.230 [2024-07-15 09:29:22.063121] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:13.230 [2024-07-15 09:29:22.063133] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d768a0 name raid_bdev1, state offline 00:25:13.230 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.230 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:25:13.488 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:13.489 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:13.747 /dev/nbd0 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:13.747 09:29:22 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:13.747 1+0 records in 00:25:13.747 1+0 records out 00:25:13.747 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229746 s, 17.8 MB/s 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:13.747 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:14.005 /dev/nbd1 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:14.005 1+0 records in 00:25:14.005 1+0 records out 00:25:14.005 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00030237 s, 13.5 MB/s 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:14.005 09:29:22 
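
For context on the NBD round-trip traced above and immediately below: each exported device is first probed with a single 4 KiB O_DIRECT read (the dd ... bs=4096 count=1 iflag=direct lines, checked via stat -c %s), and the two devices are then compared with cmp -i 1048576, i.e. skipping the first 1 MiB of both. That offset is consistent with the superblock layout reported throughout this run (data_offset 2048 blocks at a 512-byte blocklen = 1048576 bytes), so only the data region is compared; treat that as an inference from the logged values rather than a documented guarantee. A condensed sketch of the same sequence, using an illustrative scratch path in place of the workspace nbdtest file:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Export BaseBdev1 and the rebuilt spare over NBD
    $rpc -s $sock nbd_start_disk BaseBdev1 /dev/nbd0
    $rpc -s $sock nbd_start_disk spare    /dev/nbd1

    # Probe each device with one direct 4 KiB read, as waitfornbd does in the trace
    dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    dd if=/dev/nbd1 of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    rm -f /tmp/nbdtest

    # Compare everything past the 1 MiB superblock region (2048 blocks * 512 B)
    cmp -i 1048576 /dev/nbd0 /dev/nbd1

    # Detach both NBD devices again
    $rpc -s $sock nbd_stop_disk /dev/nbd0
    $rpc -s $sock nbd_stop_disk /dev/nbd1
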
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:14.005 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:14.264 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:14.264 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:14.264 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:14.264 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:14.264 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:14.264 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:14.264 09:29:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:14.524 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:14.781 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:14.781 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:14.781 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:14.781 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:14.781 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:14.781 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:14.781 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:14.782 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:14.782 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:14.782 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:15.039 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:15.039 [2024-07-15 09:29:23.971495] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:15.039 [2024-07-15 09:29:23.971543] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:15.039 [2024-07-15 09:29:23.971564] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df0b40 00:25:15.039 [2024-07-15 09:29:23.971577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:15.039 [2024-07-15 09:29:23.973230] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:15.039 [2024-07-15 09:29:23.973262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:15.039 [2024-07-15 09:29:23.973345] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:15.039 [2024-07-15 09:29:23.973374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:15.039 [2024-07-15 09:29:23.973481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:15.039 [2024-07-15 09:29:23.973554] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:15.039 spare 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.297 09:29:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.297 [2024-07-15 09:29:24.073871] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d7aba0 00:25:15.297 [2024-07-15 09:29:24.073889] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:15.297 [2024-07-15 09:29:24.074095] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d77560 00:25:15.297 [2024-07-15 09:29:24.074249] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d7aba0 00:25:15.297 [2024-07-15 09:29:24.074260] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d7aba0 00:25:15.297 [2024-07-15 09:29:24.074362] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:25:15.297 09:29:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:15.297 "name": "raid_bdev1", 00:25:15.297 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:15.297 "strip_size_kb": 0, 00:25:15.297 "state": "online", 00:25:15.297 "raid_level": "raid1", 00:25:15.297 "superblock": true, 00:25:15.297 "num_base_bdevs": 4, 00:25:15.297 "num_base_bdevs_discovered": 3, 00:25:15.297 "num_base_bdevs_operational": 3, 00:25:15.297 "base_bdevs_list": [ 00:25:15.297 { 00:25:15.297 "name": "spare", 00:25:15.297 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:15.297 "is_configured": true, 00:25:15.297 "data_offset": 2048, 00:25:15.297 "data_size": 63488 00:25:15.297 }, 00:25:15.297 { 00:25:15.297 "name": null, 00:25:15.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.297 "is_configured": false, 00:25:15.297 "data_offset": 2048, 00:25:15.297 "data_size": 63488 00:25:15.297 }, 00:25:15.297 { 00:25:15.297 "name": "BaseBdev3", 00:25:15.297 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:15.297 "is_configured": true, 00:25:15.297 "data_offset": 2048, 00:25:15.297 "data_size": 63488 00:25:15.297 }, 00:25:15.297 { 00:25:15.297 "name": "BaseBdev4", 00:25:15.297 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:15.297 "is_configured": true, 00:25:15.297 "data_offset": 2048, 00:25:15.297 "data_size": 63488 00:25:15.297 } 00:25:15.297 ] 00:25:15.297 }' 00:25:15.297 09:29:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:15.297 09:29:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:16.232 09:29:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:16.232 09:29:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:16.232 09:29:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:16.232 09:29:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:16.232 09:29:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:16.232 09:29:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.232 09:29:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.232 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:16.232 "name": "raid_bdev1", 00:25:16.232 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:16.232 "strip_size_kb": 0, 00:25:16.232 "state": "online", 00:25:16.232 "raid_level": "raid1", 00:25:16.232 "superblock": true, 00:25:16.232 "num_base_bdevs": 4, 00:25:16.232 "num_base_bdevs_discovered": 3, 00:25:16.232 "num_base_bdevs_operational": 3, 00:25:16.232 "base_bdevs_list": [ 00:25:16.232 { 00:25:16.232 "name": "spare", 00:25:16.232 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:16.232 "is_configured": true, 00:25:16.232 "data_offset": 2048, 00:25:16.232 "data_size": 63488 00:25:16.232 }, 00:25:16.232 { 00:25:16.232 "name": null, 00:25:16.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:16.232 "is_configured": false, 00:25:16.232 "data_offset": 2048, 00:25:16.232 "data_size": 63488 00:25:16.232 }, 00:25:16.232 { 00:25:16.232 "name": "BaseBdev3", 00:25:16.232 "uuid": 
"d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:16.232 "is_configured": true, 00:25:16.232 "data_offset": 2048, 00:25:16.232 "data_size": 63488 00:25:16.232 }, 00:25:16.232 { 00:25:16.232 "name": "BaseBdev4", 00:25:16.232 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:16.232 "is_configured": true, 00:25:16.232 "data_offset": 2048, 00:25:16.232 "data_size": 63488 00:25:16.232 } 00:25:16.232 ] 00:25:16.232 }' 00:25:16.232 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:16.232 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:16.232 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:16.510 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:16.510 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.510 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:16.510 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:16.510 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:16.776 [2024-07-15 09:29:25.668135] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:16.776 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:16.776 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:16.776 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:16.777 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:16.777 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:16.777 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:16.777 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:16.777 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:16.777 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:16.777 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:16.777 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.777 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.034 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:17.034 "name": "raid_bdev1", 00:25:17.034 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:17.034 "strip_size_kb": 0, 00:25:17.034 "state": "online", 00:25:17.034 "raid_level": "raid1", 00:25:17.034 "superblock": true, 00:25:17.034 "num_base_bdevs": 4, 00:25:17.034 "num_base_bdevs_discovered": 2, 00:25:17.034 "num_base_bdevs_operational": 2, 00:25:17.034 "base_bdevs_list": [ 00:25:17.034 { 00:25:17.034 "name": null, 
00:25:17.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.034 "is_configured": false, 00:25:17.034 "data_offset": 2048, 00:25:17.034 "data_size": 63488 00:25:17.034 }, 00:25:17.034 { 00:25:17.034 "name": null, 00:25:17.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.034 "is_configured": false, 00:25:17.034 "data_offset": 2048, 00:25:17.034 "data_size": 63488 00:25:17.034 }, 00:25:17.034 { 00:25:17.034 "name": "BaseBdev3", 00:25:17.034 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:17.034 "is_configured": true, 00:25:17.034 "data_offset": 2048, 00:25:17.035 "data_size": 63488 00:25:17.035 }, 00:25:17.035 { 00:25:17.035 "name": "BaseBdev4", 00:25:17.035 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:17.035 "is_configured": true, 00:25:17.035 "data_offset": 2048, 00:25:17.035 "data_size": 63488 00:25:17.035 } 00:25:17.035 ] 00:25:17.035 }' 00:25:17.035 09:29:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:17.035 09:29:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:17.601 09:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:17.860 [2024-07-15 09:29:26.747014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:17.860 [2024-07-15 09:29:26.747169] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:17.860 [2024-07-15 09:29:26.747185] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:17.860 [2024-07-15 09:29:26.747215] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:17.860 [2024-07-15 09:29:26.751225] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d7a740 00:25:17.860 [2024-07-15 09:29:26.753578] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:17.860 09:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:19.235 09:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:19.235 09:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:19.235 09:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:19.235 09:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:19.235 09:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:19.235 09:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.235 09:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.235 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:19.235 "name": "raid_bdev1", 00:25:19.235 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:19.235 "strip_size_kb": 0, 00:25:19.235 "state": "online", 00:25:19.235 "raid_level": "raid1", 00:25:19.235 "superblock": true, 00:25:19.235 "num_base_bdevs": 4, 00:25:19.235 "num_base_bdevs_discovered": 3, 00:25:19.235 "num_base_bdevs_operational": 3, 
00:25:19.235 "process": { 00:25:19.235 "type": "rebuild", 00:25:19.235 "target": "spare", 00:25:19.235 "progress": { 00:25:19.235 "blocks": 24576, 00:25:19.235 "percent": 38 00:25:19.235 } 00:25:19.235 }, 00:25:19.235 "base_bdevs_list": [ 00:25:19.235 { 00:25:19.235 "name": "spare", 00:25:19.235 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:19.235 "is_configured": true, 00:25:19.235 "data_offset": 2048, 00:25:19.235 "data_size": 63488 00:25:19.235 }, 00:25:19.235 { 00:25:19.235 "name": null, 00:25:19.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:19.235 "is_configured": false, 00:25:19.235 "data_offset": 2048, 00:25:19.235 "data_size": 63488 00:25:19.235 }, 00:25:19.235 { 00:25:19.235 "name": "BaseBdev3", 00:25:19.235 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:19.235 "is_configured": true, 00:25:19.235 "data_offset": 2048, 00:25:19.235 "data_size": 63488 00:25:19.235 }, 00:25:19.235 { 00:25:19.235 "name": "BaseBdev4", 00:25:19.235 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:19.235 "is_configured": true, 00:25:19.235 "data_offset": 2048, 00:25:19.235 "data_size": 63488 00:25:19.235 } 00:25:19.235 ] 00:25:19.235 }' 00:25:19.235 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:19.235 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:19.235 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:19.235 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:19.236 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:19.494 [2024-07-15 09:29:28.332817] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:19.494 [2024-07-15 09:29:28.366332] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:19.494 [2024-07-15 09:29:28.366376] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:19.494 [2024-07-15 09:29:28.366392] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:19.494 [2024-07-15 09:29:28.366401] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.494 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.753 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:19.753 "name": "raid_bdev1", 00:25:19.753 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:19.753 "strip_size_kb": 0, 00:25:19.753 "state": "online", 00:25:19.753 "raid_level": "raid1", 00:25:19.753 "superblock": true, 00:25:19.753 "num_base_bdevs": 4, 00:25:19.753 "num_base_bdevs_discovered": 2, 00:25:19.753 "num_base_bdevs_operational": 2, 00:25:19.753 "base_bdevs_list": [ 00:25:19.753 { 00:25:19.753 "name": null, 00:25:19.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:19.753 "is_configured": false, 00:25:19.753 "data_offset": 2048, 00:25:19.753 "data_size": 63488 00:25:19.753 }, 00:25:19.753 { 00:25:19.753 "name": null, 00:25:19.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:19.753 "is_configured": false, 00:25:19.753 "data_offset": 2048, 00:25:19.753 "data_size": 63488 00:25:19.753 }, 00:25:19.753 { 00:25:19.753 "name": "BaseBdev3", 00:25:19.753 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:19.753 "is_configured": true, 00:25:19.753 "data_offset": 2048, 00:25:19.753 "data_size": 63488 00:25:19.753 }, 00:25:19.753 { 00:25:19.753 "name": "BaseBdev4", 00:25:19.753 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:19.753 "is_configured": true, 00:25:19.753 "data_offset": 2048, 00:25:19.753 "data_size": 63488 00:25:19.753 } 00:25:19.753 ] 00:25:19.753 }' 00:25:19.753 09:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:19.753 09:29:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:20.320 09:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:20.579 [2024-07-15 09:29:29.445817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:20.579 [2024-07-15 09:29:29.445870] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:20.579 [2024-07-15 09:29:29.445892] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d7b010 00:25:20.579 [2024-07-15 09:29:29.445906] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:20.579 [2024-07-15 09:29:29.446297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:20.579 [2024-07-15 09:29:29.446318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:20.579 [2024-07-15 09:29:29.446400] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:20.579 [2024-07-15 09:29:29.446413] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:20.579 [2024-07-15 09:29:29.446425] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
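
The vbdev_passthru and bdev_raid messages above show the spare being cycled: the passthru bdev wrapping it is deleted, recreated on top of the spare_delay base bdev, and the examine path then finds the raid superblock on it (seq_number 5, older than raid_bdev1's 6) and re-adds it, hence "Re-adding bdev spare to raid bdev raid_bdev1" followed by a fresh rebuild. The two RPCs involved are the ones visible in the trace, collected back to back here purely as a reading aid rather than as an excerpt from bdev_raid.sh:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Tear down the passthru vbdev that exposes the spare...
    $rpc -s $sock bdev_passthru_delete spare

    # ...then recreate it on the spare_delay base bdev; on examine, the raid
    # superblock found on it causes "spare" to be re-added to raid_bdev1 and
    # a new rebuild to start (per the *NOTICE* lines in the log).
    $rpc -s $sock bdev_passthru_create -b spare_delay -p spare
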
00:25:20.579 [2024-07-15 09:29:29.446446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:20.580 [2024-07-15 09:29:29.450504] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df6420 00:25:20.580 spare 00:25:20.580 [2024-07-15 09:29:29.451995] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:20.580 09:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:21.959 "name": "raid_bdev1", 00:25:21.959 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:21.959 "strip_size_kb": 0, 00:25:21.959 "state": "online", 00:25:21.959 "raid_level": "raid1", 00:25:21.959 "superblock": true, 00:25:21.959 "num_base_bdevs": 4, 00:25:21.959 "num_base_bdevs_discovered": 3, 00:25:21.959 "num_base_bdevs_operational": 3, 00:25:21.959 "process": { 00:25:21.959 "type": "rebuild", 00:25:21.959 "target": "spare", 00:25:21.959 "progress": { 00:25:21.959 "blocks": 22528, 00:25:21.959 "percent": 35 00:25:21.959 } 00:25:21.959 }, 00:25:21.959 "base_bdevs_list": [ 00:25:21.959 { 00:25:21.959 "name": "spare", 00:25:21.959 "uuid": "9c41ef76-976d-5760-ab77-fd4e7fa702ad", 00:25:21.959 "is_configured": true, 00:25:21.959 "data_offset": 2048, 00:25:21.959 "data_size": 63488 00:25:21.959 }, 00:25:21.959 { 00:25:21.959 "name": null, 00:25:21.959 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:21.959 "is_configured": false, 00:25:21.959 "data_offset": 2048, 00:25:21.959 "data_size": 63488 00:25:21.959 }, 00:25:21.959 { 00:25:21.959 "name": "BaseBdev3", 00:25:21.959 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:21.959 "is_configured": true, 00:25:21.959 "data_offset": 2048, 00:25:21.959 "data_size": 63488 00:25:21.959 }, 00:25:21.959 { 00:25:21.959 "name": "BaseBdev4", 00:25:21.959 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:21.959 "is_configured": true, 00:25:21.959 "data_offset": 2048, 00:25:21.959 "data_size": 63488 00:25:21.959 } 00:25:21.959 ] 00:25:21.959 }' 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:21.959 09:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:22.218 [2024-07-15 09:29:30.964272] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:22.218 [2024-07-15 09:29:31.064560] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:22.218 [2024-07-15 09:29:31.064607] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:22.218 [2024-07-15 09:29:31.064623] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:22.218 [2024-07-15 09:29:31.064632] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:22.218 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.219 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.478 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:22.478 "name": "raid_bdev1", 00:25:22.478 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:22.478 "strip_size_kb": 0, 00:25:22.478 "state": "online", 00:25:22.478 "raid_level": "raid1", 00:25:22.478 "superblock": true, 00:25:22.478 "num_base_bdevs": 4, 00:25:22.478 "num_base_bdevs_discovered": 2, 00:25:22.478 "num_base_bdevs_operational": 2, 00:25:22.478 "base_bdevs_list": [ 00:25:22.478 { 00:25:22.478 "name": null, 00:25:22.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:22.478 "is_configured": false, 00:25:22.478 "data_offset": 2048, 00:25:22.478 "data_size": 63488 00:25:22.478 }, 00:25:22.478 { 00:25:22.478 "name": null, 00:25:22.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:22.478 "is_configured": false, 00:25:22.478 "data_offset": 2048, 00:25:22.478 "data_size": 63488 00:25:22.478 }, 00:25:22.478 { 00:25:22.478 "name": "BaseBdev3", 00:25:22.478 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:22.478 "is_configured": true, 00:25:22.478 "data_offset": 2048, 00:25:22.478 "data_size": 63488 00:25:22.478 }, 00:25:22.478 { 00:25:22.478 "name": "BaseBdev4", 00:25:22.478 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:22.478 "is_configured": true, 00:25:22.478 "data_offset": 2048, 00:25:22.478 "data_size": 63488 
00:25:22.478 } 00:25:22.478 ] 00:25:22.478 }' 00:25:22.478 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:22.479 09:29:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:23.047 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:23.047 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:23.047 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:23.047 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:23.047 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:23.047 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.047 09:29:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.306 09:29:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:23.306 "name": "raid_bdev1", 00:25:23.306 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:23.306 "strip_size_kb": 0, 00:25:23.306 "state": "online", 00:25:23.306 "raid_level": "raid1", 00:25:23.306 "superblock": true, 00:25:23.306 "num_base_bdevs": 4, 00:25:23.306 "num_base_bdevs_discovered": 2, 00:25:23.306 "num_base_bdevs_operational": 2, 00:25:23.306 "base_bdevs_list": [ 00:25:23.306 { 00:25:23.306 "name": null, 00:25:23.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.306 "is_configured": false, 00:25:23.306 "data_offset": 2048, 00:25:23.306 "data_size": 63488 00:25:23.306 }, 00:25:23.306 { 00:25:23.306 "name": null, 00:25:23.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.306 "is_configured": false, 00:25:23.306 "data_offset": 2048, 00:25:23.306 "data_size": 63488 00:25:23.306 }, 00:25:23.306 { 00:25:23.306 "name": "BaseBdev3", 00:25:23.306 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:23.306 "is_configured": true, 00:25:23.306 "data_offset": 2048, 00:25:23.306 "data_size": 63488 00:25:23.306 }, 00:25:23.307 { 00:25:23.307 "name": "BaseBdev4", 00:25:23.307 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:23.307 "is_configured": true, 00:25:23.307 "data_offset": 2048, 00:25:23.307 "data_size": 63488 00:25:23.307 } 00:25:23.307 ] 00:25:23.307 }' 00:25:23.307 09:29:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:23.307 09:29:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:23.307 09:29:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:23.566 09:29:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:23.566 09:29:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:23.566 09:29:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:23.824 [2024-07-15 09:29:32.724916] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:23.824 
[2024-07-15 09:29:32.724969] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:23.824 [2024-07-15 09:29:32.724990] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df6e30 00:25:23.824 [2024-07-15 09:29:32.725003] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:23.824 [2024-07-15 09:29:32.725351] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:23.824 [2024-07-15 09:29:32.725372] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:23.824 [2024-07-15 09:29:32.725438] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:23.824 [2024-07-15 09:29:32.725450] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:23.824 [2024-07-15 09:29:32.725461] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:23.824 BaseBdev1 00:25:23.824 09:29:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:25.204 "name": "raid_bdev1", 00:25:25.204 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:25.204 "strip_size_kb": 0, 00:25:25.204 "state": "online", 00:25:25.204 "raid_level": "raid1", 00:25:25.204 "superblock": true, 00:25:25.204 "num_base_bdevs": 4, 00:25:25.204 "num_base_bdevs_discovered": 2, 00:25:25.204 "num_base_bdevs_operational": 2, 00:25:25.204 "base_bdevs_list": [ 00:25:25.204 { 00:25:25.204 "name": null, 00:25:25.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.204 "is_configured": false, 00:25:25.204 "data_offset": 2048, 00:25:25.204 "data_size": 63488 00:25:25.204 }, 00:25:25.204 { 00:25:25.204 "name": null, 00:25:25.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.204 "is_configured": false, 00:25:25.204 "data_offset": 2048, 00:25:25.204 "data_size": 63488 00:25:25.204 }, 00:25:25.204 { 
00:25:25.204 "name": "BaseBdev3", 00:25:25.204 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:25.204 "is_configured": true, 00:25:25.204 "data_offset": 2048, 00:25:25.204 "data_size": 63488 00:25:25.204 }, 00:25:25.204 { 00:25:25.204 "name": "BaseBdev4", 00:25:25.204 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:25.204 "is_configured": true, 00:25:25.204 "data_offset": 2048, 00:25:25.204 "data_size": 63488 00:25:25.204 } 00:25:25.204 ] 00:25:25.204 }' 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:25.204 09:29:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:25.772 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:25.772 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:25.772 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:25.772 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:25.772 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:25.772 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.772 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:26.032 "name": "raid_bdev1", 00:25:26.032 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:26.032 "strip_size_kb": 0, 00:25:26.032 "state": "online", 00:25:26.032 "raid_level": "raid1", 00:25:26.032 "superblock": true, 00:25:26.032 "num_base_bdevs": 4, 00:25:26.032 "num_base_bdevs_discovered": 2, 00:25:26.032 "num_base_bdevs_operational": 2, 00:25:26.032 "base_bdevs_list": [ 00:25:26.032 { 00:25:26.032 "name": null, 00:25:26.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.032 "is_configured": false, 00:25:26.032 "data_offset": 2048, 00:25:26.032 "data_size": 63488 00:25:26.032 }, 00:25:26.032 { 00:25:26.032 "name": null, 00:25:26.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.032 "is_configured": false, 00:25:26.032 "data_offset": 2048, 00:25:26.032 "data_size": 63488 00:25:26.032 }, 00:25:26.032 { 00:25:26.032 "name": "BaseBdev3", 00:25:26.032 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:26.032 "is_configured": true, 00:25:26.032 "data_offset": 2048, 00:25:26.032 "data_size": 63488 00:25:26.032 }, 00:25:26.032 { 00:25:26.032 "name": "BaseBdev4", 00:25:26.032 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:26.032 "is_configured": true, 00:25:26.032 "data_offset": 2048, 00:25:26.032 "data_size": 63488 00:25:26.032 } 00:25:26.032 ] 00:25:26.032 }' 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:26.032 09:29:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:26.292 [2024-07-15 09:29:35.167636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:26.292 [2024-07-15 09:29:35.167777] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:26.292 [2024-07-15 09:29:35.167794] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:26.292 request: 00:25:26.292 { 00:25:26.292 "base_bdev": "BaseBdev1", 00:25:26.292 "raid_bdev": "raid_bdev1", 00:25:26.292 "method": "bdev_raid_add_base_bdev", 00:25:26.292 "req_id": 1 00:25:26.292 } 00:25:26.292 Got JSON-RPC error response 00:25:26.292 response: 00:25:26.292 { 00:25:26.292 "code": -22, 00:25:26.292 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:26.292 } 00:25:26.292 09:29:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:25:26.292 09:29:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:26.292 09:29:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:26.292 09:29:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:26.292 09:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:27.672 "name": "raid_bdev1", 00:25:27.672 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:27.672 "strip_size_kb": 0, 00:25:27.672 "state": "online", 00:25:27.672 "raid_level": "raid1", 00:25:27.672 "superblock": true, 00:25:27.672 "num_base_bdevs": 4, 00:25:27.672 "num_base_bdevs_discovered": 2, 00:25:27.672 "num_base_bdevs_operational": 2, 00:25:27.672 "base_bdevs_list": [ 00:25:27.672 { 00:25:27.672 "name": null, 00:25:27.672 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.672 "is_configured": false, 00:25:27.672 "data_offset": 2048, 00:25:27.672 "data_size": 63488 00:25:27.672 }, 00:25:27.672 { 00:25:27.672 "name": null, 00:25:27.672 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.672 "is_configured": false, 00:25:27.672 "data_offset": 2048, 00:25:27.672 "data_size": 63488 00:25:27.672 }, 00:25:27.672 { 00:25:27.672 "name": "BaseBdev3", 00:25:27.672 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:27.672 "is_configured": true, 00:25:27.672 "data_offset": 2048, 00:25:27.672 "data_size": 63488 00:25:27.672 }, 00:25:27.672 { 00:25:27.672 "name": "BaseBdev4", 00:25:27.672 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:27.672 "is_configured": true, 00:25:27.672 "data_offset": 2048, 00:25:27.672 "data_size": 63488 00:25:27.672 } 00:25:27.672 ] 00:25:27.672 }' 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:27.672 09:29:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:28.242 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:28.242 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:28.242 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:28.242 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:28.242 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:28.242 09:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.242 09:29:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:28.502 "name": "raid_bdev1", 00:25:28.502 "uuid": "a47bd7f1-fffb-4b95-b705-fa623e0dfb46", 00:25:28.502 "strip_size_kb": 0, 00:25:28.502 "state": "online", 00:25:28.502 "raid_level": "raid1", 00:25:28.502 "superblock": true, 00:25:28.502 "num_base_bdevs": 4, 00:25:28.502 "num_base_bdevs_discovered": 2, 00:25:28.502 "num_base_bdevs_operational": 2, 00:25:28.502 "base_bdevs_list": [ 00:25:28.502 { 00:25:28.502 "name": null, 00:25:28.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.502 "is_configured": false, 00:25:28.502 "data_offset": 2048, 00:25:28.502 "data_size": 63488 00:25:28.502 }, 00:25:28.502 { 00:25:28.502 "name": null, 00:25:28.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.502 "is_configured": false, 00:25:28.502 "data_offset": 2048, 00:25:28.502 "data_size": 63488 00:25:28.502 }, 00:25:28.502 { 00:25:28.502 "name": "BaseBdev3", 00:25:28.502 "uuid": "d002f998-8ddd-53ef-9e0b-a5f024cd6e72", 00:25:28.502 "is_configured": true, 00:25:28.502 "data_offset": 2048, 00:25:28.502 "data_size": 63488 00:25:28.502 }, 00:25:28.502 { 00:25:28.502 "name": "BaseBdev4", 00:25:28.502 "uuid": "cbb43a9f-769e-596d-b3fb-f787e5abda14", 00:25:28.502 "is_configured": true, 00:25:28.502 "data_offset": 2048, 00:25:28.502 "data_size": 63488 00:25:28.502 } 00:25:28.502 ] 00:25:28.502 }' 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 209728 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 209728 ']' 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 209728 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 209728 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 209728' 00:25:28.502 killing process with pid 209728 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 209728 00:25:28.502 Received shutdown signal, test time was about 60.000000 seconds 00:25:28.502 00:25:28.502 Latency(us) 00:25:28.502 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:28.502 =================================================================================================================== 00:25:28.502 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:28.502 [2024-07-15 09:29:37.326933] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:28.502 [2024-07-15 09:29:37.327031] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:28.502 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 209728 00:25:28.502 [2024-07-15 09:29:37.327090] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:28.502 [2024-07-15 09:29:37.327107] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d7aba0 name raid_bdev1, state offline 00:25:28.502 [2024-07-15 09:29:37.377916] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:25:28.761 00:25:28.761 real 0m38.292s 00:25:28.761 user 0m55.075s 00:25:28.761 sys 0m7.052s 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:28.761 ************************************ 00:25:28.761 END TEST raid_rebuild_test_sb 00:25:28.761 ************************************ 00:25:28.761 09:29:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:28.761 09:29:37 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:25:28.761 09:29:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:28.761 09:29:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:28.761 09:29:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:28.761 ************************************ 00:25:28.761 START TEST raid_rebuild_test_io 00:25:28.761 ************************************ 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:28.761 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:28.762 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:28.762 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=215112 00:25:28.762 09:29:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 215112 /var/tmp/spdk-raid.sock 00:25:28.762 09:29:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 215112 ']' 00:25:28.762 09:29:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:28.762 09:29:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:28.762 09:29:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:28.762 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:28.762 09:29:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:28.762 09:29:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:29.020 [2024-07-15 09:29:37.744735] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:25:29.020 [2024-07-15 09:29:37.744800] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid215112 ] 00:25:29.020 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:29.020 Zero copy mechanism will not be used. 
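The bdevperf command recorded just above (bdev_raid.sh@595) can be reproduced on its own roughly as follows. This is a sketch using the exact flags from the trace, run from an SPDK build tree; a plain socket-wait loop stands in for the waitforlisten helper the test script actually uses:

  # Sketch: start bdevperf as the background-I/O generator for raid_bdev1.
  # The 3M I/O size is what triggers the "greater than zero copy threshold"
  # notice printed in this log.
  sock=/var/tmp/spdk-raid.sock
  ./build/examples/bdevperf -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
      -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!
  while [ ! -S "$sock" ]; do sleep 0.1; done   # crude stand-in for waitforlisten

With -z the job waits for an RPC before running its workload; judging from the timestamps in this log, the perform_tests call issued later (bdev_raid.sh@622) is what kicks it off.
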
00:25:29.020 [2024-07-15 09:29:37.871541] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:29.280 [2024-07-15 09:29:37.975748] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:29.280 [2024-07-15 09:29:38.033366] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:29.280 [2024-07-15 09:29:38.033402] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:29.280 09:29:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:29.280 09:29:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:25:29.280 09:29:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:29.280 09:29:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:29.539 BaseBdev1_malloc 00:25:29.539 09:29:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:29.799 [2024-07-15 09:29:38.697831] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:29.799 [2024-07-15 09:29:38.697881] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.799 [2024-07-15 09:29:38.697904] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2395d40 00:25:29.799 [2024-07-15 09:29:38.697918] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.799 [2024-07-15 09:29:38.699694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.799 [2024-07-15 09:29:38.699724] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:29.799 BaseBdev1 00:25:29.799 09:29:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:29.799 09:29:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:30.058 BaseBdev2_malloc 00:25:30.058 09:29:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:30.317 [2024-07-15 09:29:39.189241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:30.317 [2024-07-15 09:29:39.189289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:30.317 [2024-07-15 09:29:39.189312] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2396860 00:25:30.317 [2024-07-15 09:29:39.189325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:30.317 [2024-07-15 09:29:39.190916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:30.317 [2024-07-15 09:29:39.190954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:30.317 BaseBdev2 00:25:30.317 09:29:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:30.317 09:29:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:30.576 BaseBdev3_malloc 00:25:30.576 09:29:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:30.835 [2024-07-15 09:29:39.688473] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:30.835 [2024-07-15 09:29:39.688518] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:30.835 [2024-07-15 09:29:39.688538] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25438f0 00:25:30.835 [2024-07-15 09:29:39.688551] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:30.835 [2024-07-15 09:29:39.690104] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:30.835 [2024-07-15 09:29:39.690134] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:30.835 BaseBdev3 00:25:30.835 09:29:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:30.835 09:29:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:31.094 BaseBdev4_malloc 00:25:31.094 09:29:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:31.353 [2024-07-15 09:29:40.186377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:31.353 [2024-07-15 09:29:40.186428] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.353 [2024-07-15 09:29:40.186451] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2542ad0 00:25:31.353 [2024-07-15 09:29:40.186464] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.353 [2024-07-15 09:29:40.188067] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.353 [2024-07-15 09:29:40.188095] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:31.353 BaseBdev4 00:25:31.353 09:29:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:31.612 spare_malloc 00:25:31.612 09:29:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:31.871 spare_delay 00:25:31.871 09:29:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:32.130 [2024-07-15 09:29:40.852779] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:32.130 [2024-07-15 09:29:40.852829] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:32.130 [2024-07-15 09:29:40.852851] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25475b0 00:25:32.130 [2024-07-15 09:29:40.852865] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:32.130 [2024-07-15 09:29:40.854511] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:32.130 [2024-07-15 09:29:40.854543] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:32.130 spare 00:25:32.130 09:29:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:32.389 [2024-07-15 09:29:41.093432] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:32.389 [2024-07-15 09:29:41.094785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:32.389 [2024-07-15 09:29:41.094840] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:32.389 [2024-07-15 09:29:41.094886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:32.389 [2024-07-15 09:29:41.094976] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24c68a0 00:25:32.389 [2024-07-15 09:29:41.094987] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:32.389 [2024-07-15 09:29:41.095206] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2540e10 00:25:32.389 [2024-07-15 09:29:41.095367] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24c68a0 00:25:32.389 [2024-07-15 09:29:41.095377] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24c68a0 00:25:32.389 [2024-07-15 09:29:41.095497] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.389 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.648 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:32.648 "name": "raid_bdev1", 00:25:32.648 "uuid": 
"03ece04e-216f-476a-9e1a-74644379e05a", 00:25:32.648 "strip_size_kb": 0, 00:25:32.648 "state": "online", 00:25:32.648 "raid_level": "raid1", 00:25:32.648 "superblock": false, 00:25:32.648 "num_base_bdevs": 4, 00:25:32.648 "num_base_bdevs_discovered": 4, 00:25:32.648 "num_base_bdevs_operational": 4, 00:25:32.648 "base_bdevs_list": [ 00:25:32.648 { 00:25:32.648 "name": "BaseBdev1", 00:25:32.648 "uuid": "567325ce-526a-5d78-8086-a5c213bf14ed", 00:25:32.648 "is_configured": true, 00:25:32.648 "data_offset": 0, 00:25:32.648 "data_size": 65536 00:25:32.648 }, 00:25:32.648 { 00:25:32.648 "name": "BaseBdev2", 00:25:32.648 "uuid": "6d8ff09b-005a-5bb9-8b35-b7c3959825c5", 00:25:32.648 "is_configured": true, 00:25:32.648 "data_offset": 0, 00:25:32.648 "data_size": 65536 00:25:32.648 }, 00:25:32.648 { 00:25:32.648 "name": "BaseBdev3", 00:25:32.648 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:32.648 "is_configured": true, 00:25:32.648 "data_offset": 0, 00:25:32.648 "data_size": 65536 00:25:32.648 }, 00:25:32.648 { 00:25:32.648 "name": "BaseBdev4", 00:25:32.648 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:32.648 "is_configured": true, 00:25:32.648 "data_offset": 0, 00:25:32.648 "data_size": 65536 00:25:32.648 } 00:25:32.648 ] 00:25:32.648 }' 00:25:32.648 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:32.648 09:29:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:33.215 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:33.215 09:29:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:33.215 [2024-07-15 09:29:42.088342] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:33.215 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:33.215 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.215 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:33.474 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:33.474 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:33.474 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:33.474 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:33.733 [2024-07-15 09:29:42.447355] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24cc970 00:25:33.733 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:33.733 Zero copy mechanism will not be used. 00:25:33.733 Running I/O for 60 seconds... 
00:25:33.733 [2024-07-15 09:29:42.559606] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:33.733 [2024-07-15 09:29:42.567777] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x24cc970 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.733 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.991 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.991 "name": "raid_bdev1", 00:25:33.991 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:33.991 "strip_size_kb": 0, 00:25:33.991 "state": "online", 00:25:33.991 "raid_level": "raid1", 00:25:33.991 "superblock": false, 00:25:33.991 "num_base_bdevs": 4, 00:25:33.991 "num_base_bdevs_discovered": 3, 00:25:33.991 "num_base_bdevs_operational": 3, 00:25:33.991 "base_bdevs_list": [ 00:25:33.991 { 00:25:33.991 "name": null, 00:25:33.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.991 "is_configured": false, 00:25:33.991 "data_offset": 0, 00:25:33.991 "data_size": 65536 00:25:33.991 }, 00:25:33.991 { 00:25:33.991 "name": "BaseBdev2", 00:25:33.991 "uuid": "6d8ff09b-005a-5bb9-8b35-b7c3959825c5", 00:25:33.992 "is_configured": true, 00:25:33.992 "data_offset": 0, 00:25:33.992 "data_size": 65536 00:25:33.992 }, 00:25:33.992 { 00:25:33.992 "name": "BaseBdev3", 00:25:33.992 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:33.992 "is_configured": true, 00:25:33.992 "data_offset": 0, 00:25:33.992 "data_size": 65536 00:25:33.992 }, 00:25:33.992 { 00:25:33.992 "name": "BaseBdev4", 00:25:33.992 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:33.992 "is_configured": true, 00:25:33.992 "data_offset": 0, 00:25:33.992 "data_size": 65536 00:25:33.992 } 00:25:33.992 ] 00:25:33.992 }' 00:25:33.992 09:29:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.992 09:29:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:34.594 09:29:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:34.852 [2024-07-15 09:29:43.710536] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:34.852 09:29:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:34.852 [2024-07-15 09:29:43.793761] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x209cfa0 00:25:34.852 [2024-07-15 09:29:43.796202] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:35.111 [2024-07-15 09:29:43.898819] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:35.111 [2024-07-15 09:29:43.899151] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:35.370 [2024-07-15 09:29:44.113181] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:35.370 [2024-07-15 09:29:44.113477] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:35.629 [2024-07-15 09:29:44.523545] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:35.888 09:29:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.888 09:29:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.888 09:29:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:35.888 09:29:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:35.888 09:29:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.888 09:29:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.888 09:29:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.146 [2024-07-15 09:29:44.901815] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:36.146 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:36.146 "name": "raid_bdev1", 00:25:36.146 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:36.146 "strip_size_kb": 0, 00:25:36.146 "state": "online", 00:25:36.146 "raid_level": "raid1", 00:25:36.146 "superblock": false, 00:25:36.146 "num_base_bdevs": 4, 00:25:36.146 "num_base_bdevs_discovered": 4, 00:25:36.146 "num_base_bdevs_operational": 4, 00:25:36.146 "process": { 00:25:36.146 "type": "rebuild", 00:25:36.146 "target": "spare", 00:25:36.146 "progress": { 00:25:36.146 "blocks": 16384, 00:25:36.146 "percent": 25 00:25:36.146 } 00:25:36.146 }, 00:25:36.146 "base_bdevs_list": [ 00:25:36.146 { 00:25:36.146 "name": "spare", 00:25:36.146 "uuid": "12ce5fdc-ca30-5a15-91d2-88968bef5303", 00:25:36.146 "is_configured": true, 00:25:36.146 "data_offset": 0, 00:25:36.146 "data_size": 65536 00:25:36.146 }, 00:25:36.146 { 00:25:36.146 "name": "BaseBdev2", 00:25:36.146 "uuid": "6d8ff09b-005a-5bb9-8b35-b7c3959825c5", 00:25:36.146 "is_configured": true, 00:25:36.146 "data_offset": 0, 00:25:36.146 "data_size": 65536 00:25:36.146 }, 00:25:36.146 { 00:25:36.146 "name": "BaseBdev3", 00:25:36.146 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:36.146 
"is_configured": true, 00:25:36.146 "data_offset": 0, 00:25:36.146 "data_size": 65536 00:25:36.146 }, 00:25:36.146 { 00:25:36.146 "name": "BaseBdev4", 00:25:36.146 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:36.146 "is_configured": true, 00:25:36.146 "data_offset": 0, 00:25:36.146 "data_size": 65536 00:25:36.146 } 00:25:36.146 ] 00:25:36.146 }' 00:25:36.146 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:36.146 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:36.146 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:36.403 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:36.403 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:36.403 [2024-07-15 09:29:45.147071] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:36.403 [2024-07-15 09:29:45.291757] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:36.403 [2024-07-15 09:29:45.347895] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.661 [2024-07-15 09:29:45.502574] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:36.661 [2024-07-15 09:29:45.515331] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:36.661 [2024-07-15 09:29:45.515369] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.661 [2024-07-15 09:29:45.515380] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:36.661 [2024-07-15 09:29:45.531305] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x24cc970 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.661 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.920 09:29:45 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.920 "name": "raid_bdev1", 00:25:36.920 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:36.920 "strip_size_kb": 0, 00:25:36.920 "state": "online", 00:25:36.920 "raid_level": "raid1", 00:25:36.920 "superblock": false, 00:25:36.920 "num_base_bdevs": 4, 00:25:36.920 "num_base_bdevs_discovered": 3, 00:25:36.920 "num_base_bdevs_operational": 3, 00:25:36.920 "base_bdevs_list": [ 00:25:36.920 { 00:25:36.920 "name": null, 00:25:36.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.920 "is_configured": false, 00:25:36.920 "data_offset": 0, 00:25:36.920 "data_size": 65536 00:25:36.920 }, 00:25:36.920 { 00:25:36.920 "name": "BaseBdev2", 00:25:36.920 "uuid": "6d8ff09b-005a-5bb9-8b35-b7c3959825c5", 00:25:36.920 "is_configured": true, 00:25:36.920 "data_offset": 0, 00:25:36.920 "data_size": 65536 00:25:36.920 }, 00:25:36.920 { 00:25:36.920 "name": "BaseBdev3", 00:25:36.920 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:36.920 "is_configured": true, 00:25:36.920 "data_offset": 0, 00:25:36.920 "data_size": 65536 00:25:36.920 }, 00:25:36.920 { 00:25:36.920 "name": "BaseBdev4", 00:25:36.920 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:36.920 "is_configured": true, 00:25:36.920 "data_offset": 0, 00:25:36.920 "data_size": 65536 00:25:36.920 } 00:25:36.920 ] 00:25:36.920 }' 00:25:36.920 09:29:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.920 09:29:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:37.857 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:37.857 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:37.857 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:37.857 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:37.857 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:37.857 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.857 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.857 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:37.857 "name": "raid_bdev1", 00:25:37.857 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:37.857 "strip_size_kb": 0, 00:25:37.857 "state": "online", 00:25:37.857 "raid_level": "raid1", 00:25:37.857 "superblock": false, 00:25:37.857 "num_base_bdevs": 4, 00:25:37.857 "num_base_bdevs_discovered": 3, 00:25:37.857 "num_base_bdevs_operational": 3, 00:25:37.857 "base_bdevs_list": [ 00:25:37.857 { 00:25:37.857 "name": null, 00:25:37.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:37.857 "is_configured": false, 00:25:37.857 "data_offset": 0, 00:25:37.857 "data_size": 65536 00:25:37.857 }, 00:25:37.857 { 00:25:37.857 "name": "BaseBdev2", 00:25:37.857 "uuid": "6d8ff09b-005a-5bb9-8b35-b7c3959825c5", 00:25:37.857 "is_configured": true, 00:25:37.857 "data_offset": 0, 00:25:37.857 "data_size": 65536 00:25:37.857 }, 00:25:37.857 { 00:25:37.857 "name": "BaseBdev3", 00:25:37.857 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:37.857 "is_configured": true, 
00:25:37.857 "data_offset": 0, 00:25:37.857 "data_size": 65536 00:25:37.857 }, 00:25:37.857 { 00:25:37.857 "name": "BaseBdev4", 00:25:37.857 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:37.857 "is_configured": true, 00:25:37.857 "data_offset": 0, 00:25:37.857 "data_size": 65536 00:25:37.857 } 00:25:37.857 ] 00:25:37.857 }' 00:25:37.857 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.116 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:38.116 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.116 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:38.116 09:29:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:38.375 [2024-07-15 09:29:47.100041] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:38.375 09:29:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:38.375 [2024-07-15 09:29:47.184860] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24c9270 00:25:38.375 [2024-07-15 09:29:47.186399] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:38.375 [2024-07-15 09:29:47.305439] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:38.375 [2024-07-15 09:29:47.306621] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:38.634 [2024-07-15 09:29:47.529046] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:38.634 [2024-07-15 09:29:47.529636] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:39.202 [2024-07-15 09:29:47.865467] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:39.202 [2024-07-15 09:29:47.875838] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:39.202 [2024-07-15 09:29:48.107497] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:39.461 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:39.461 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:39.461 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:39.461 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:39.461 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:39.461 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.461 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:25:39.720 "name": "raid_bdev1", 00:25:39.720 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:39.720 "strip_size_kb": 0, 00:25:39.720 "state": "online", 00:25:39.720 "raid_level": "raid1", 00:25:39.720 "superblock": false, 00:25:39.720 "num_base_bdevs": 4, 00:25:39.720 "num_base_bdevs_discovered": 4, 00:25:39.720 "num_base_bdevs_operational": 4, 00:25:39.720 "process": { 00:25:39.720 "type": "rebuild", 00:25:39.720 "target": "spare", 00:25:39.720 "progress": { 00:25:39.720 "blocks": 12288, 00:25:39.720 "percent": 18 00:25:39.720 } 00:25:39.720 }, 00:25:39.720 "base_bdevs_list": [ 00:25:39.720 { 00:25:39.720 "name": "spare", 00:25:39.720 "uuid": "12ce5fdc-ca30-5a15-91d2-88968bef5303", 00:25:39.720 "is_configured": true, 00:25:39.720 "data_offset": 0, 00:25:39.720 "data_size": 65536 00:25:39.720 }, 00:25:39.720 { 00:25:39.720 "name": "BaseBdev2", 00:25:39.720 "uuid": "6d8ff09b-005a-5bb9-8b35-b7c3959825c5", 00:25:39.720 "is_configured": true, 00:25:39.720 "data_offset": 0, 00:25:39.720 "data_size": 65536 00:25:39.720 }, 00:25:39.720 { 00:25:39.720 "name": "BaseBdev3", 00:25:39.720 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:39.720 "is_configured": true, 00:25:39.720 "data_offset": 0, 00:25:39.720 "data_size": 65536 00:25:39.720 }, 00:25:39.720 { 00:25:39.720 "name": "BaseBdev4", 00:25:39.720 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:39.720 "is_configured": true, 00:25:39.720 "data_offset": 0, 00:25:39.720 "data_size": 65536 00:25:39.720 } 00:25:39.720 ] 00:25:39.720 }' 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:39.720 [2024-07-15 09:29:48.443869] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:39.720 09:29:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:39.720 [2024-07-15 09:29:48.669012] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:39.979 [2024-07-15 09:29:48.749995] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:40.238 [2024-07-15 09:29:48.992734] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x24cc970 00:25:40.238 [2024-07-15 09:29:48.992771] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x24c9270 00:25:40.238 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:40.238 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:25:40.238 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:40.238 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:40.238 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:40.238 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:40.238 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:40.238 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.238 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.497 [2024-07-15 09:29:49.262637] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:40.497 "name": "raid_bdev1", 00:25:40.497 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:40.497 "strip_size_kb": 0, 00:25:40.497 "state": "online", 00:25:40.497 "raid_level": "raid1", 00:25:40.497 "superblock": false, 00:25:40.497 "num_base_bdevs": 4, 00:25:40.497 "num_base_bdevs_discovered": 3, 00:25:40.497 "num_base_bdevs_operational": 3, 00:25:40.497 "process": { 00:25:40.497 "type": "rebuild", 00:25:40.497 "target": "spare", 00:25:40.497 "progress": { 00:25:40.497 "blocks": 20480, 00:25:40.497 "percent": 31 00:25:40.497 } 00:25:40.497 }, 00:25:40.497 "base_bdevs_list": [ 00:25:40.497 { 00:25:40.497 "name": "spare", 00:25:40.497 "uuid": "12ce5fdc-ca30-5a15-91d2-88968bef5303", 00:25:40.497 "is_configured": true, 00:25:40.497 "data_offset": 0, 00:25:40.497 "data_size": 65536 00:25:40.497 }, 00:25:40.497 { 00:25:40.497 "name": null, 00:25:40.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.497 "is_configured": false, 00:25:40.497 "data_offset": 0, 00:25:40.497 "data_size": 65536 00:25:40.497 }, 00:25:40.497 { 00:25:40.497 "name": "BaseBdev3", 00:25:40.497 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:40.497 "is_configured": true, 00:25:40.497 "data_offset": 0, 00:25:40.497 "data_size": 65536 00:25:40.497 }, 00:25:40.497 { 00:25:40.497 "name": "BaseBdev4", 00:25:40.497 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:40.497 "is_configured": true, 00:25:40.497 "data_offset": 0, 00:25:40.497 "data_size": 65536 00:25:40.497 } 00:25:40.497 ] 00:25:40.497 }' 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=933 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.497 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.756 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:40.756 "name": "raid_bdev1", 00:25:40.756 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:40.756 "strip_size_kb": 0, 00:25:40.756 "state": "online", 00:25:40.756 "raid_level": "raid1", 00:25:40.756 "superblock": false, 00:25:40.756 "num_base_bdevs": 4, 00:25:40.756 "num_base_bdevs_discovered": 3, 00:25:40.756 "num_base_bdevs_operational": 3, 00:25:40.756 "process": { 00:25:40.756 "type": "rebuild", 00:25:40.756 "target": "spare", 00:25:40.756 "progress": { 00:25:40.756 "blocks": 24576, 00:25:40.756 "percent": 37 00:25:40.756 } 00:25:40.756 }, 00:25:40.756 "base_bdevs_list": [ 00:25:40.756 { 00:25:40.756 "name": "spare", 00:25:40.756 "uuid": "12ce5fdc-ca30-5a15-91d2-88968bef5303", 00:25:40.756 "is_configured": true, 00:25:40.756 "data_offset": 0, 00:25:40.756 "data_size": 65536 00:25:40.756 }, 00:25:40.756 { 00:25:40.756 "name": null, 00:25:40.756 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.756 "is_configured": false, 00:25:40.756 "data_offset": 0, 00:25:40.756 "data_size": 65536 00:25:40.756 }, 00:25:40.756 { 00:25:40.756 "name": "BaseBdev3", 00:25:40.756 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:40.756 "is_configured": true, 00:25:40.756 "data_offset": 0, 00:25:40.756 "data_size": 65536 00:25:40.756 }, 00:25:40.756 { 00:25:40.756 "name": "BaseBdev4", 00:25:40.756 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:40.756 "is_configured": true, 00:25:40.756 "data_offset": 0, 00:25:40.756 "data_size": 65536 00:25:40.756 } 00:25:40.756 ] 00:25:40.756 }' 00:25:40.756 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:40.756 [2024-07-15 09:29:49.617818] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:40.756 [2024-07-15 09:29:49.618261] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:40.756 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:40.756 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:40.756 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:40.756 09:29:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:41.015 [2024-07-15 09:29:49.757644] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:41.274 [2024-07-15 09:29:50.138038] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:25:41.841 09:29:50 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:41.841 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:41.841 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:41.841 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:41.841 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:41.841 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:41.841 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.841 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.110 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:42.110 "name": "raid_bdev1", 00:25:42.110 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:42.110 "strip_size_kb": 0, 00:25:42.110 "state": "online", 00:25:42.110 "raid_level": "raid1", 00:25:42.110 "superblock": false, 00:25:42.110 "num_base_bdevs": 4, 00:25:42.110 "num_base_bdevs_discovered": 3, 00:25:42.110 "num_base_bdevs_operational": 3, 00:25:42.110 "process": { 00:25:42.110 "type": "rebuild", 00:25:42.110 "target": "spare", 00:25:42.110 "progress": { 00:25:42.110 "blocks": 45056, 00:25:42.110 "percent": 68 00:25:42.110 } 00:25:42.110 }, 00:25:42.110 "base_bdevs_list": [ 00:25:42.110 { 00:25:42.110 "name": "spare", 00:25:42.110 "uuid": "12ce5fdc-ca30-5a15-91d2-88968bef5303", 00:25:42.110 "is_configured": true, 00:25:42.110 "data_offset": 0, 00:25:42.110 "data_size": 65536 00:25:42.110 }, 00:25:42.110 { 00:25:42.110 "name": null, 00:25:42.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.110 "is_configured": false, 00:25:42.110 "data_offset": 0, 00:25:42.110 "data_size": 65536 00:25:42.110 }, 00:25:42.110 { 00:25:42.110 "name": "BaseBdev3", 00:25:42.110 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:42.110 "is_configured": true, 00:25:42.110 "data_offset": 0, 00:25:42.110 "data_size": 65536 00:25:42.110 }, 00:25:42.110 { 00:25:42.110 "name": "BaseBdev4", 00:25:42.110 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:42.110 "is_configured": true, 00:25:42.110 "data_offset": 0, 00:25:42.110 "data_size": 65536 00:25:42.110 } 00:25:42.110 ] 00:25:42.110 }' 00:25:42.110 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:42.110 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:42.110 09:29:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:42.110 09:29:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:42.110 09:29:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:43.048 [2024-07-15 09:29:51.925877] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:43.307 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:43.307 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:43.307 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # 
local raid_bdev_name=raid_bdev1 00:25:43.307 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:43.307 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:43.307 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.307 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.307 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.307 [2024-07-15 09:29:52.026148] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:43.307 [2024-07-15 09:29:52.028488] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:43.874 "name": "raid_bdev1", 00:25:43.874 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:43.874 "strip_size_kb": 0, 00:25:43.874 "state": "online", 00:25:43.874 "raid_level": "raid1", 00:25:43.874 "superblock": false, 00:25:43.874 "num_base_bdevs": 4, 00:25:43.874 "num_base_bdevs_discovered": 3, 00:25:43.874 "num_base_bdevs_operational": 3, 00:25:43.874 "base_bdevs_list": [ 00:25:43.874 { 00:25:43.874 "name": "spare", 00:25:43.874 "uuid": "12ce5fdc-ca30-5a15-91d2-88968bef5303", 00:25:43.874 "is_configured": true, 00:25:43.874 "data_offset": 0, 00:25:43.874 "data_size": 65536 00:25:43.874 }, 00:25:43.874 { 00:25:43.874 "name": null, 00:25:43.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.874 "is_configured": false, 00:25:43.874 "data_offset": 0, 00:25:43.874 "data_size": 65536 00:25:43.874 }, 00:25:43.874 { 00:25:43.874 "name": "BaseBdev3", 00:25:43.874 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:43.874 "is_configured": true, 00:25:43.874 "data_offset": 0, 00:25:43.874 "data_size": 65536 00:25:43.874 }, 00:25:43.874 { 00:25:43.874 "name": "BaseBdev4", 00:25:43.874 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:43.874 "is_configured": true, 00:25:43.874 "data_offset": 0, 00:25:43.874 "data_size": 65536 00:25:43.874 } 00:25:43.874 ] 00:25:43.874 }' 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.874 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:44.133 "name": "raid_bdev1", 00:25:44.133 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:44.133 "strip_size_kb": 0, 00:25:44.133 "state": "online", 00:25:44.133 "raid_level": "raid1", 00:25:44.133 "superblock": false, 00:25:44.133 "num_base_bdevs": 4, 00:25:44.133 "num_base_bdevs_discovered": 3, 00:25:44.133 "num_base_bdevs_operational": 3, 00:25:44.133 "base_bdevs_list": [ 00:25:44.133 { 00:25:44.133 "name": "spare", 00:25:44.133 "uuid": "12ce5fdc-ca30-5a15-91d2-88968bef5303", 00:25:44.133 "is_configured": true, 00:25:44.133 "data_offset": 0, 00:25:44.133 "data_size": 65536 00:25:44.133 }, 00:25:44.133 { 00:25:44.133 "name": null, 00:25:44.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.133 "is_configured": false, 00:25:44.133 "data_offset": 0, 00:25:44.133 "data_size": 65536 00:25:44.133 }, 00:25:44.133 { 00:25:44.133 "name": "BaseBdev3", 00:25:44.133 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:44.133 "is_configured": true, 00:25:44.133 "data_offset": 0, 00:25:44.133 "data_size": 65536 00:25:44.133 }, 00:25:44.133 { 00:25:44.133 "name": "BaseBdev4", 00:25:44.133 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:44.133 "is_configured": true, 00:25:44.133 "data_offset": 0, 00:25:44.133 "data_size": 65536 00:25:44.133 } 00:25:44.133 ] 00:25:44.133 }' 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:44.133 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:44.134 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:44.134 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:44.134 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:44.134 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:44.134 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:44.134 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.134 09:29:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.392 09:29:53 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:44.392 "name": "raid_bdev1", 00:25:44.392 "uuid": "03ece04e-216f-476a-9e1a-74644379e05a", 00:25:44.392 "strip_size_kb": 0, 00:25:44.392 "state": "online", 00:25:44.392 "raid_level": "raid1", 00:25:44.392 "superblock": false, 00:25:44.392 "num_base_bdevs": 4, 00:25:44.392 "num_base_bdevs_discovered": 3, 00:25:44.392 "num_base_bdevs_operational": 3, 00:25:44.392 "base_bdevs_list": [ 00:25:44.392 { 00:25:44.392 "name": "spare", 00:25:44.392 "uuid": "12ce5fdc-ca30-5a15-91d2-88968bef5303", 00:25:44.392 "is_configured": true, 00:25:44.392 "data_offset": 0, 00:25:44.392 "data_size": 65536 00:25:44.392 }, 00:25:44.392 { 00:25:44.392 "name": null, 00:25:44.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.392 "is_configured": false, 00:25:44.392 "data_offset": 0, 00:25:44.392 "data_size": 65536 00:25:44.392 }, 00:25:44.392 { 00:25:44.392 "name": "BaseBdev3", 00:25:44.392 "uuid": "db555d78-22f5-556f-b7ea-fa4f2bdcc1e1", 00:25:44.392 "is_configured": true, 00:25:44.392 "data_offset": 0, 00:25:44.392 "data_size": 65536 00:25:44.392 }, 00:25:44.392 { 00:25:44.392 "name": "BaseBdev4", 00:25:44.392 "uuid": "ab551a21-2a5f-5df4-b430-9f2eefde60cf", 00:25:44.392 "is_configured": true, 00:25:44.392 "data_offset": 0, 00:25:44.392 "data_size": 65536 00:25:44.392 } 00:25:44.392 ] 00:25:44.392 }' 00:25:44.392 09:29:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:44.392 09:29:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:44.958 09:29:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:45.217 [2024-07-15 09:29:54.027277] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:45.217 [2024-07-15 09:29:54.027313] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:45.217 00:25:45.217 Latency(us) 00:25:45.217 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:45.217 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:45.217 raid_bdev1 : 11.65 88.36 265.09 0.00 0.00 15606.30 300.97 118534.68 00:25:45.217 =================================================================================================================== 00:25:45.217 Total : 88.36 265.09 0.00 0.00 15606.30 300.97 118534.68 00:25:45.217 [2024-07-15 09:29:54.127494] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:45.217 [2024-07-15 09:29:54.127523] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:45.217 [2024-07-15 09:29:54.127615] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:45.217 [2024-07-15 09:29:54.127627] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24c68a0 name raid_bdev1, state offline 00:25:45.217 0 00:25:45.217 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.217 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 
-- # '[' true = true ']' 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:45.476 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:45.477 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:45.477 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:45.736 /dev/nbd0 00:25:45.736 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:45.736 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:45.736 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:45.736 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:45.736 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:45.736 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:45.736 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:45.737 1+0 records in 00:25:45.737 1+0 records out 00:25:45.737 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280123 s, 14.6 MB/s 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:45.737 09:29:54 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:45.737 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:45.996 /dev/nbd1 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:45.996 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:46.256 1+0 records in 00:25:46.256 1+0 records out 00:25:46.256 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271554 s, 15.1 MB/s 00:25:46.256 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.256 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:46.256 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.256 09:29:54 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:46.256 09:29:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:46.256 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:46.256 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.256 09:29:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:46.256 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:46.256 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.256 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:46.256 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:46.256 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:46.256 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:46.256 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:46.516 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:46.517 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:46.517 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:46.517 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:46.517 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.517 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:46.775 /dev/nbd1 
00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:46.775 1+0 records in 00:25:46.775 1+0 records out 00:25:46.775 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236407 s, 17.3 MB/s 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:46.775 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:47.036 09:29:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:47.295 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:47.295 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:47.295 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:47.295 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:47.295 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:47.295 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 215112 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 215112 ']' 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 215112 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 215112 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 215112' 00:25:47.554 killing process with pid 215112 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 215112 00:25:47.554 Received shutdown signal, test time was about 13.817442 seconds 00:25:47.554 00:25:47.554 Latency(us) 00:25:47.554 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 
00:25:47.554 =================================================================================================================== 00:25:47.554 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:47.554 [2024-07-15 09:29:56.301053] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:47.554 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 215112 00:25:47.554 [2024-07-15 09:29:56.345781] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:47.814 00:25:47.814 real 0m18.892s 00:25:47.814 user 0m29.595s 00:25:47.814 sys 0m3.410s 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:47.814 ************************************ 00:25:47.814 END TEST raid_rebuild_test_io 00:25:47.814 ************************************ 00:25:47.814 09:29:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:47.814 09:29:56 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:25:47.814 09:29:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:47.814 09:29:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:47.814 09:29:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:47.814 ************************************ 00:25:47.814 START TEST raid_rebuild_test_sb_io 00:25:47.814 ************************************ 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo 
BaseBdev4 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=217820 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 217820 /var/tmp/spdk-raid.sock 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 217820 ']' 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:47.814 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:47.814 09:29:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:47.814 [2024-07-15 09:29:56.734028] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:25:47.814 [2024-07-15 09:29:56.734095] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid217820 ] 00:25:47.814 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:47.814 Zero copy mechanism will not be used. 
00:25:48.073 [2024-07-15 09:29:56.845757] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:48.073 [2024-07-15 09:29:56.950208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:48.073 [2024-07-15 09:29:57.008766] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:48.073 [2024-07-15 09:29:57.008802] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:49.011 09:29:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:49.011 09:29:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:25:49.011 09:29:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:49.011 09:29:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:49.011 BaseBdev1_malloc 00:25:49.011 09:29:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:49.270 [2024-07-15 09:29:58.152612] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:49.270 [2024-07-15 09:29:58.152669] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:49.270 [2024-07-15 09:29:58.152693] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x90fd40 00:25:49.270 [2024-07-15 09:29:58.152706] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:49.270 [2024-07-15 09:29:58.154399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:49.270 [2024-07-15 09:29:58.154431] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:49.270 BaseBdev1 00:25:49.270 09:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:49.270 09:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:49.529 BaseBdev2_malloc 00:25:49.529 09:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:49.788 [2024-07-15 09:29:58.646917] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:49.788 [2024-07-15 09:29:58.646977] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:49.788 [2024-07-15 09:29:58.647001] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x910860 00:25:49.788 [2024-07-15 09:29:58.647015] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:49.788 [2024-07-15 09:29:58.648492] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:49.788 [2024-07-15 09:29:58.648522] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:49.788 BaseBdev2 00:25:49.788 09:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:49.788 09:29:58 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:50.047 BaseBdev3_malloc 00:25:50.047 09:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:50.306 [2024-07-15 09:29:59.140981] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:50.306 [2024-07-15 09:29:59.141027] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:50.306 [2024-07-15 09:29:59.141047] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xabd8f0 00:25:50.306 [2024-07-15 09:29:59.141060] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:50.306 [2024-07-15 09:29:59.142435] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:50.306 [2024-07-15 09:29:59.142467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:50.306 BaseBdev3 00:25:50.306 09:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:50.306 09:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:50.565 BaseBdev4_malloc 00:25:50.565 09:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:50.823 [2024-07-15 09:29:59.634790] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:50.823 [2024-07-15 09:29:59.634834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:50.823 [2024-07-15 09:29:59.634854] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xabcad0 00:25:50.823 [2024-07-15 09:29:59.634866] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:50.823 [2024-07-15 09:29:59.636227] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:50.823 [2024-07-15 09:29:59.636255] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:50.823 BaseBdev4 00:25:50.823 09:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:51.108 spare_malloc 00:25:51.108 09:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:51.379 spare_delay 00:25:51.379 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:51.638 [2024-07-15 09:30:00.385441] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:51.638 [2024-07-15 09:30:00.385496] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:51.638 [2024-07-15 
09:30:00.385519] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xac15b0 00:25:51.638 [2024-07-15 09:30:00.385532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:51.638 [2024-07-15 09:30:00.387098] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:51.638 [2024-07-15 09:30:00.387128] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:51.638 spare 00:25:51.638 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:51.897 [2024-07-15 09:30:00.630120] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:51.897 [2024-07-15 09:30:00.631419] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:51.897 [2024-07-15 09:30:00.631476] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:51.897 [2024-07-15 09:30:00.631523] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:51.897 [2024-07-15 09:30:00.631726] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa408a0 00:25:51.897 [2024-07-15 09:30:00.631737] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:51.897 [2024-07-15 09:30:00.631959] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xabae10 00:25:51.897 [2024-07-15 09:30:00.632120] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa408a0 00:25:51.897 [2024-07-15 09:30:00.632131] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa408a0 00:25:51.897 [2024-07-15 09:30:00.632234] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.897 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.156 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:25:52.156 "name": "raid_bdev1", 00:25:52.156 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:25:52.156 "strip_size_kb": 0, 00:25:52.156 "state": "online", 00:25:52.156 "raid_level": "raid1", 00:25:52.156 "superblock": true, 00:25:52.156 "num_base_bdevs": 4, 00:25:52.156 "num_base_bdevs_discovered": 4, 00:25:52.156 "num_base_bdevs_operational": 4, 00:25:52.156 "base_bdevs_list": [ 00:25:52.156 { 00:25:52.156 "name": "BaseBdev1", 00:25:52.156 "uuid": "070c3992-5d7d-5dc6-b845-5959d75bc4dd", 00:25:52.156 "is_configured": true, 00:25:52.156 "data_offset": 2048, 00:25:52.156 "data_size": 63488 00:25:52.156 }, 00:25:52.156 { 00:25:52.156 "name": "BaseBdev2", 00:25:52.156 "uuid": "068e8d61-3be8-5c28-9715-d90fa189aac7", 00:25:52.156 "is_configured": true, 00:25:52.156 "data_offset": 2048, 00:25:52.156 "data_size": 63488 00:25:52.156 }, 00:25:52.156 { 00:25:52.156 "name": "BaseBdev3", 00:25:52.156 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:25:52.156 "is_configured": true, 00:25:52.156 "data_offset": 2048, 00:25:52.156 "data_size": 63488 00:25:52.156 }, 00:25:52.156 { 00:25:52.156 "name": "BaseBdev4", 00:25:52.156 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:25:52.156 "is_configured": true, 00:25:52.156 "data_offset": 2048, 00:25:52.156 "data_size": 63488 00:25:52.156 } 00:25:52.156 ] 00:25:52.156 }' 00:25:52.156 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:52.156 09:30:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:52.722 09:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:52.722 09:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:52.979 [2024-07-15 09:30:01.737327] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:52.979 09:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:52.979 09:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.979 09:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:53.238 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:53.238 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:53.238 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:53.238 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:53.238 [2024-07-15 09:30:02.108096] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x90f670 00:25:53.238 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:53.238 Zero copy mechanism will not be used. 00:25:53.238 Running I/O for 60 seconds... 
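The block above is the heart of verify_raid_bdev_state: one bdev_raid_get_bdevs RPC, filtered with jq, and the resulting JSON compared against the expected state, level and base-bdev counts. A minimal stand-alone sketch of that same query follows; it reuses only the socket path, bdev name and field names visible in the trace, and the per-field echo lines are this sketch's illustration of what gets compared, not code copied from bdev_raid.sh.

# Pull the raid_bdev1 entry exactly as the trace does, then look at the fields the test asserts on.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

raid_bdev_info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

echo "$raid_bdev_info" | jq -r '.state'                      # expected "online" for verify_raid_bdev_state raid_bdev1 online ...
echo "$raid_bdev_info" | jq -r '.raid_level'                 # expected "raid1"
echo "$raid_bdev_info" | jq -r '.strip_size_kb'              # expected 0 (raid1 carries no stripe size)
echo "$raid_bdev_info" | jq -r '.num_base_bdevs_discovered'  # 4 here; drops to 3 once BaseBdev1 is removed below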
00:25:53.496 [2024-07-15 09:30:02.235438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:53.496 [2024-07-15 09:30:02.243666] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x90f670 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.496 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.754 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:53.754 "name": "raid_bdev1", 00:25:53.754 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:25:53.754 "strip_size_kb": 0, 00:25:53.754 "state": "online", 00:25:53.754 "raid_level": "raid1", 00:25:53.754 "superblock": true, 00:25:53.754 "num_base_bdevs": 4, 00:25:53.754 "num_base_bdevs_discovered": 3, 00:25:53.754 "num_base_bdevs_operational": 3, 00:25:53.754 "base_bdevs_list": [ 00:25:53.754 { 00:25:53.754 "name": null, 00:25:53.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.754 "is_configured": false, 00:25:53.754 "data_offset": 2048, 00:25:53.754 "data_size": 63488 00:25:53.754 }, 00:25:53.754 { 00:25:53.754 "name": "BaseBdev2", 00:25:53.754 "uuid": "068e8d61-3be8-5c28-9715-d90fa189aac7", 00:25:53.754 "is_configured": true, 00:25:53.754 "data_offset": 2048, 00:25:53.754 "data_size": 63488 00:25:53.754 }, 00:25:53.754 { 00:25:53.754 "name": "BaseBdev3", 00:25:53.754 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:25:53.754 "is_configured": true, 00:25:53.754 "data_offset": 2048, 00:25:53.754 "data_size": 63488 00:25:53.754 }, 00:25:53.754 { 00:25:53.754 "name": "BaseBdev4", 00:25:53.754 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:25:53.754 "is_configured": true, 00:25:53.754 "data_offset": 2048, 00:25:53.754 "data_size": 63488 00:25:53.754 } 00:25:53.754 ] 00:25:53.754 }' 00:25:53.754 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.754 09:30:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:54.320 09:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 
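Re-adding a base bdev is what drives the rest of this trace: bdev_raid_add_base_bdev raid_bdev1 spare makes the target claim the spare, log "Started rebuild on raid bdev raid_bdev1", and grow a "process" object in the JSON dumps with type rebuild, target spare and a rising percent. The polling loop below is a hedged illustration of watching that object with the same RPC and jq paths used by the test; it is not part of bdev_raid.sh.

# Attach the spare to the degraded raid1 and watch the rebuild progress it triggers.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

"$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare

while true; do
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # Quoting the expansion keeps test(1) happy even when the field is empty, unlike the unquoted
    # empty expansion that produces "[: =: unary operator expected" at bdev_raid.sh line 665 later in this log.
    [ "$(echo "$info" | jq -r '.process.type // "none"')" = rebuild ] || break
    echo "rebuild at $(echo "$info" | jq -r '.process.progress.percent')%"
    sleep 1
done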
00:25:54.578 [2024-07-15 09:30:03.413439] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:54.578 [2024-07-15 09:30:03.450732] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa42ba0 00:25:54.578 [2024-07-15 09:30:03.453141] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:54.578 09:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:54.836 [2024-07-15 09:30:03.582804] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:54.836 [2024-07-15 09:30:03.583207] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:54.836 [2024-07-15 09:30:03.732993] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:54.836 [2024-07-15 09:30:03.733691] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:55.401 [2024-07-15 09:30:04.078188] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:55.401 [2024-07-15 09:30:04.308622] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:55.401 [2024-07-15 09:30:04.308882] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:55.659 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:55.659 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.659 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:55.659 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:55.659 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.659 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.659 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.917 [2024-07-15 09:30:04.654414] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:55.917 [2024-07-15 09:30:04.655577] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:55.917 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:55.917 "name": "raid_bdev1", 00:25:55.917 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:25:55.917 "strip_size_kb": 0, 00:25:55.917 "state": "online", 00:25:55.917 "raid_level": "raid1", 00:25:55.917 "superblock": true, 00:25:55.917 "num_base_bdevs": 4, 00:25:55.917 "num_base_bdevs_discovered": 4, 00:25:55.917 "num_base_bdevs_operational": 4, 00:25:55.917 "process": { 00:25:55.917 "type": "rebuild", 00:25:55.917 "target": "spare", 00:25:55.917 "progress": { 00:25:55.917 "blocks": 14336, 00:25:55.917 "percent": 22 00:25:55.917 } 00:25:55.917 }, 00:25:55.917 "base_bdevs_list": [ 00:25:55.917 { 
00:25:55.917 "name": "spare", 00:25:55.917 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:25:55.917 "is_configured": true, 00:25:55.917 "data_offset": 2048, 00:25:55.917 "data_size": 63488 00:25:55.917 }, 00:25:55.917 { 00:25:55.917 "name": "BaseBdev2", 00:25:55.917 "uuid": "068e8d61-3be8-5c28-9715-d90fa189aac7", 00:25:55.917 "is_configured": true, 00:25:55.917 "data_offset": 2048, 00:25:55.917 "data_size": 63488 00:25:55.917 }, 00:25:55.917 { 00:25:55.917 "name": "BaseBdev3", 00:25:55.917 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:25:55.917 "is_configured": true, 00:25:55.917 "data_offset": 2048, 00:25:55.917 "data_size": 63488 00:25:55.917 }, 00:25:55.917 { 00:25:55.917 "name": "BaseBdev4", 00:25:55.917 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:25:55.917 "is_configured": true, 00:25:55.917 "data_offset": 2048, 00:25:55.917 "data_size": 63488 00:25:55.917 } 00:25:55.917 ] 00:25:55.917 }' 00:25:55.917 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:55.917 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:55.917 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:55.917 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:55.917 09:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:56.177 [2024-07-15 09:30:04.898860] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:56.177 [2024-07-15 09:30:05.050669] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:56.436 [2024-07-15 09:30:05.137309] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:56.436 [2024-07-15 09:30:05.248007] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:56.436 [2024-07-15 09:30:05.270166] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:56.436 [2024-07-15 09:30:05.270210] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:56.436 [2024-07-15 09:30:05.270229] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:56.436 [2024-07-15 09:30:05.301874] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x90f670 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.436 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.695 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:56.695 "name": "raid_bdev1", 00:25:56.695 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:25:56.695 "strip_size_kb": 0, 00:25:56.695 "state": "online", 00:25:56.695 "raid_level": "raid1", 00:25:56.695 "superblock": true, 00:25:56.695 "num_base_bdevs": 4, 00:25:56.695 "num_base_bdevs_discovered": 3, 00:25:56.695 "num_base_bdevs_operational": 3, 00:25:56.695 "base_bdevs_list": [ 00:25:56.695 { 00:25:56.695 "name": null, 00:25:56.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.695 "is_configured": false, 00:25:56.695 "data_offset": 2048, 00:25:56.695 "data_size": 63488 00:25:56.695 }, 00:25:56.695 { 00:25:56.695 "name": "BaseBdev2", 00:25:56.695 "uuid": "068e8d61-3be8-5c28-9715-d90fa189aac7", 00:25:56.695 "is_configured": true, 00:25:56.695 "data_offset": 2048, 00:25:56.695 "data_size": 63488 00:25:56.695 }, 00:25:56.695 { 00:25:56.695 "name": "BaseBdev3", 00:25:56.695 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:25:56.695 "is_configured": true, 00:25:56.695 "data_offset": 2048, 00:25:56.695 "data_size": 63488 00:25:56.695 }, 00:25:56.695 { 00:25:56.695 "name": "BaseBdev4", 00:25:56.695 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:25:56.695 "is_configured": true, 00:25:56.695 "data_offset": 2048, 00:25:56.695 "data_size": 63488 00:25:56.695 } 00:25:56.695 ] 00:25:56.695 }' 00:25:56.695 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:56.695 09:30:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:57.632 "name": "raid_bdev1", 00:25:57.632 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:25:57.632 "strip_size_kb": 0, 00:25:57.632 "state": "online", 00:25:57.632 "raid_level": "raid1", 00:25:57.632 "superblock": true, 00:25:57.632 "num_base_bdevs": 4, 00:25:57.632 "num_base_bdevs_discovered": 3, 00:25:57.632 
"num_base_bdevs_operational": 3, 00:25:57.632 "base_bdevs_list": [ 00:25:57.632 { 00:25:57.632 "name": null, 00:25:57.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.632 "is_configured": false, 00:25:57.632 "data_offset": 2048, 00:25:57.632 "data_size": 63488 00:25:57.632 }, 00:25:57.632 { 00:25:57.632 "name": "BaseBdev2", 00:25:57.632 "uuid": "068e8d61-3be8-5c28-9715-d90fa189aac7", 00:25:57.632 "is_configured": true, 00:25:57.632 "data_offset": 2048, 00:25:57.632 "data_size": 63488 00:25:57.632 }, 00:25:57.632 { 00:25:57.632 "name": "BaseBdev3", 00:25:57.632 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:25:57.632 "is_configured": true, 00:25:57.632 "data_offset": 2048, 00:25:57.632 "data_size": 63488 00:25:57.632 }, 00:25:57.632 { 00:25:57.632 "name": "BaseBdev4", 00:25:57.632 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:25:57.632 "is_configured": true, 00:25:57.632 "data_offset": 2048, 00:25:57.632 "data_size": 63488 00:25:57.632 } 00:25:57.632 ] 00:25:57.632 }' 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:57.632 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:57.891 [2024-07-15 09:30:06.799055] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:58.149 09:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:58.149 [2024-07-15 09:30:06.876091] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xab5ad0 00:25:58.149 [2024-07-15 09:30:06.877677] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:58.149 [2024-07-15 09:30:07.010912] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:58.408 [2024-07-15 09:30:07.141315] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:58.408 [2024-07-15 09:30:07.141628] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:58.667 [2024-07-15 09:30:07.536990] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:58.926 09:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:58.926 09:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:59.185 09:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:59.185 09:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:59.185 09:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:59.185 09:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:59.185 09:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.185 [2024-07-15 09:30:07.920722] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:59.185 [2024-07-15 09:30:08.135592] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:59.444 "name": "raid_bdev1", 00:25:59.444 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:25:59.444 "strip_size_kb": 0, 00:25:59.444 "state": "online", 00:25:59.444 "raid_level": "raid1", 00:25:59.444 "superblock": true, 00:25:59.444 "num_base_bdevs": 4, 00:25:59.444 "num_base_bdevs_discovered": 4, 00:25:59.444 "num_base_bdevs_operational": 4, 00:25:59.444 "process": { 00:25:59.444 "type": "rebuild", 00:25:59.444 "target": "spare", 00:25:59.444 "progress": { 00:25:59.444 "blocks": 14336, 00:25:59.444 "percent": 22 00:25:59.444 } 00:25:59.444 }, 00:25:59.444 "base_bdevs_list": [ 00:25:59.444 { 00:25:59.444 "name": "spare", 00:25:59.444 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:25:59.444 "is_configured": true, 00:25:59.444 "data_offset": 2048, 00:25:59.444 "data_size": 63488 00:25:59.444 }, 00:25:59.444 { 00:25:59.444 "name": "BaseBdev2", 00:25:59.444 "uuid": "068e8d61-3be8-5c28-9715-d90fa189aac7", 00:25:59.444 "is_configured": true, 00:25:59.444 "data_offset": 2048, 00:25:59.444 "data_size": 63488 00:25:59.444 }, 00:25:59.444 { 00:25:59.444 "name": "BaseBdev3", 00:25:59.444 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:25:59.444 "is_configured": true, 00:25:59.444 "data_offset": 2048, 00:25:59.444 "data_size": 63488 00:25:59.444 }, 00:25:59.444 { 00:25:59.444 "name": "BaseBdev4", 00:25:59.444 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:25:59.444 "is_configured": true, 00:25:59.444 "data_offset": 2048, 00:25:59.444 "data_size": 63488 00:25:59.444 } 00:25:59.444 ] 00:25:59.444 }' 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:59.444 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:59.444 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:59.702 [2024-07-15 09:30:08.464315] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev2 00:25:59.702 [2024-07-15 09:30:08.464843] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:59.962 [2024-07-15 09:30:08.677989] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x90f670 00:25:59.962 [2024-07-15 09:30:08.678018] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xab5ad0 00:25:59.962 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:59.962 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:59.962 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:59.962 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:59.962 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:59.962 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:59.962 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:59.962 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.962 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.221 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:00.221 "name": "raid_bdev1", 00:26:00.221 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:00.221 "strip_size_kb": 0, 00:26:00.221 "state": "online", 00:26:00.221 "raid_level": "raid1", 00:26:00.221 "superblock": true, 00:26:00.221 "num_base_bdevs": 4, 00:26:00.221 "num_base_bdevs_discovered": 3, 00:26:00.221 "num_base_bdevs_operational": 3, 00:26:00.221 "process": { 00:26:00.221 "type": "rebuild", 00:26:00.221 "target": "spare", 00:26:00.221 "progress": { 00:26:00.221 "blocks": 24576, 00:26:00.221 "percent": 38 00:26:00.221 } 00:26:00.221 }, 00:26:00.221 "base_bdevs_list": [ 00:26:00.221 { 00:26:00.221 "name": "spare", 00:26:00.221 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:00.221 "is_configured": true, 00:26:00.221 "data_offset": 2048, 00:26:00.221 "data_size": 63488 00:26:00.221 }, 00:26:00.221 { 00:26:00.221 "name": null, 00:26:00.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:00.221 "is_configured": false, 00:26:00.221 "data_offset": 2048, 00:26:00.221 "data_size": 63488 00:26:00.221 }, 00:26:00.221 { 00:26:00.221 "name": "BaseBdev3", 00:26:00.221 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:00.221 "is_configured": true, 00:26:00.221 "data_offset": 2048, 00:26:00.221 "data_size": 63488 00:26:00.221 }, 00:26:00.221 { 00:26:00.221 "name": "BaseBdev4", 00:26:00.221 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:00.221 "is_configured": true, 00:26:00.221 "data_offset": 2048, 00:26:00.221 "data_size": 63488 00:26:00.221 } 00:26:00.221 ] 00:26:00.221 }' 00:26:00.221 09:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:26:00.221 [2024-07-15 09:30:09.068887] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:00.221 [2024-07-15 09:30:09.069788] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=953 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.221 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.480 [2024-07-15 09:30:09.292060] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:26:00.480 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:00.480 "name": "raid_bdev1", 00:26:00.480 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:00.480 "strip_size_kb": 0, 00:26:00.480 "state": "online", 00:26:00.480 "raid_level": "raid1", 00:26:00.480 "superblock": true, 00:26:00.480 "num_base_bdevs": 4, 00:26:00.480 "num_base_bdevs_discovered": 3, 00:26:00.480 "num_base_bdevs_operational": 3, 00:26:00.480 "process": { 00:26:00.480 "type": "rebuild", 00:26:00.480 "target": "spare", 00:26:00.480 "progress": { 00:26:00.480 "blocks": 28672, 00:26:00.480 "percent": 45 00:26:00.480 } 00:26:00.480 }, 00:26:00.480 "base_bdevs_list": [ 00:26:00.480 { 00:26:00.480 "name": "spare", 00:26:00.480 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:00.480 "is_configured": true, 00:26:00.480 "data_offset": 2048, 00:26:00.480 "data_size": 63488 00:26:00.480 }, 00:26:00.480 { 00:26:00.480 "name": null, 00:26:00.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:00.480 "is_configured": false, 00:26:00.480 "data_offset": 2048, 00:26:00.480 "data_size": 63488 00:26:00.480 }, 00:26:00.480 { 00:26:00.480 "name": "BaseBdev3", 00:26:00.480 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:00.480 "is_configured": true, 00:26:00.480 "data_offset": 2048, 00:26:00.480 "data_size": 63488 00:26:00.480 }, 00:26:00.480 { 00:26:00.480 "name": "BaseBdev4", 00:26:00.480 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:00.480 "is_configured": true, 00:26:00.480 "data_offset": 2048, 00:26:00.480 "data_size": 63488 00:26:00.480 } 00:26:00.480 ] 00:26:00.480 }' 00:26:00.480 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.480 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:00.480 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.739 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:00.739 09:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:00.739 [2024-07-15 09:30:09.642986] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:01.307 [2024-07-15 09:30:10.202513] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:26:01.575 [2024-07-15 09:30:10.441136] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:26:01.576 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:01.576 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:01.576 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:01.576 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:01.576 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:01.576 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:01.576 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.576 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.836 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:01.836 "name": "raid_bdev1", 00:26:01.836 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:01.836 "strip_size_kb": 0, 00:26:01.836 "state": "online", 00:26:01.836 "raid_level": "raid1", 00:26:01.836 "superblock": true, 00:26:01.836 "num_base_bdevs": 4, 00:26:01.836 "num_base_bdevs_discovered": 3, 00:26:01.836 "num_base_bdevs_operational": 3, 00:26:01.836 "process": { 00:26:01.836 "type": "rebuild", 00:26:01.836 "target": "spare", 00:26:01.836 "progress": { 00:26:01.836 "blocks": 49152, 00:26:01.836 "percent": 77 00:26:01.836 } 00:26:01.836 }, 00:26:01.836 "base_bdevs_list": [ 00:26:01.836 { 00:26:01.836 "name": "spare", 00:26:01.836 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:01.836 "is_configured": true, 00:26:01.836 "data_offset": 2048, 00:26:01.836 "data_size": 63488 00:26:01.836 }, 00:26:01.836 { 00:26:01.836 "name": null, 00:26:01.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.836 "is_configured": false, 00:26:01.836 "data_offset": 2048, 00:26:01.836 "data_size": 63488 00:26:01.836 }, 00:26:01.836 { 00:26:01.836 "name": "BaseBdev3", 00:26:01.836 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:01.836 "is_configured": true, 00:26:01.836 "data_offset": 2048, 00:26:01.836 "data_size": 63488 00:26:01.836 }, 00:26:01.836 { 00:26:01.836 "name": "BaseBdev4", 00:26:01.836 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:01.836 "is_configured": true, 00:26:01.836 "data_offset": 2048, 00:26:01.836 "data_size": 63488 00:26:01.836 } 00:26:01.836 ] 00:26:01.836 }' 00:26:01.836 09:30:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:01.836 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:01.836 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:02.094 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:02.094 09:30:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:02.353 [2024-07-15 09:30:11.222543] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:26:02.611 [2024-07-15 09:30:11.563987] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:02.870 [2024-07-15 09:30:11.672258] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:02.870 [2024-07-15 09:30:11.674216] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:02.870 09:30:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:02.870 09:30:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:02.870 09:30:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:02.870 09:30:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:02.870 09:30:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:02.870 09:30:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:02.870 09:30:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.870 09:30:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.128 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:03.128 "name": "raid_bdev1", 00:26:03.128 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:03.128 "strip_size_kb": 0, 00:26:03.128 "state": "online", 00:26:03.128 "raid_level": "raid1", 00:26:03.128 "superblock": true, 00:26:03.128 "num_base_bdevs": 4, 00:26:03.128 "num_base_bdevs_discovered": 3, 00:26:03.128 "num_base_bdevs_operational": 3, 00:26:03.128 "base_bdevs_list": [ 00:26:03.128 { 00:26:03.128 "name": "spare", 00:26:03.128 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:03.128 "is_configured": true, 00:26:03.128 "data_offset": 2048, 00:26:03.128 "data_size": 63488 00:26:03.128 }, 00:26:03.128 { 00:26:03.128 "name": null, 00:26:03.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:03.129 "is_configured": false, 00:26:03.129 "data_offset": 2048, 00:26:03.129 "data_size": 63488 00:26:03.129 }, 00:26:03.129 { 00:26:03.129 "name": "BaseBdev3", 00:26:03.129 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:03.129 "is_configured": true, 00:26:03.129 "data_offset": 2048, 00:26:03.129 "data_size": 63488 00:26:03.129 }, 00:26:03.129 { 00:26:03.129 "name": "BaseBdev4", 00:26:03.129 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:03.129 "is_configured": true, 00:26:03.129 "data_offset": 2048, 00:26:03.129 "data_size": 63488 00:26:03.129 } 00:26:03.129 ] 00:26:03.129 }' 00:26:03.129 09:30:12 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.387 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:03.646 "name": "raid_bdev1", 00:26:03.646 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:03.646 "strip_size_kb": 0, 00:26:03.646 "state": "online", 00:26:03.646 "raid_level": "raid1", 00:26:03.646 "superblock": true, 00:26:03.646 "num_base_bdevs": 4, 00:26:03.646 "num_base_bdevs_discovered": 3, 00:26:03.646 "num_base_bdevs_operational": 3, 00:26:03.646 "base_bdevs_list": [ 00:26:03.646 { 00:26:03.646 "name": "spare", 00:26:03.646 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:03.646 "is_configured": true, 00:26:03.646 "data_offset": 2048, 00:26:03.646 "data_size": 63488 00:26:03.646 }, 00:26:03.646 { 00:26:03.646 "name": null, 00:26:03.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:03.646 "is_configured": false, 00:26:03.646 "data_offset": 2048, 00:26:03.646 "data_size": 63488 00:26:03.646 }, 00:26:03.646 { 00:26:03.646 "name": "BaseBdev3", 00:26:03.646 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:03.646 "is_configured": true, 00:26:03.646 "data_offset": 2048, 00:26:03.646 "data_size": 63488 00:26:03.646 }, 00:26:03.646 { 00:26:03.646 "name": "BaseBdev4", 00:26:03.646 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:03.646 "is_configured": true, 00:26:03.646 "data_offset": 2048, 00:26:03.646 "data_size": 63488 00:26:03.646 } 00:26:03.646 ] 00:26:03.646 }' 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.646 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.905 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:03.905 "name": "raid_bdev1", 00:26:03.905 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:03.905 "strip_size_kb": 0, 00:26:03.905 "state": "online", 00:26:03.905 "raid_level": "raid1", 00:26:03.905 "superblock": true, 00:26:03.905 "num_base_bdevs": 4, 00:26:03.905 "num_base_bdevs_discovered": 3, 00:26:03.905 "num_base_bdevs_operational": 3, 00:26:03.905 "base_bdevs_list": [ 00:26:03.905 { 00:26:03.905 "name": "spare", 00:26:03.905 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:03.905 "is_configured": true, 00:26:03.905 "data_offset": 2048, 00:26:03.905 "data_size": 63488 00:26:03.905 }, 00:26:03.905 { 00:26:03.905 "name": null, 00:26:03.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:03.905 "is_configured": false, 00:26:03.905 "data_offset": 2048, 00:26:03.905 "data_size": 63488 00:26:03.905 }, 00:26:03.905 { 00:26:03.905 "name": "BaseBdev3", 00:26:03.905 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:03.905 "is_configured": true, 00:26:03.905 "data_offset": 2048, 00:26:03.905 "data_size": 63488 00:26:03.905 }, 00:26:03.905 { 00:26:03.905 "name": "BaseBdev4", 00:26:03.905 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:03.905 "is_configured": true, 00:26:03.905 "data_offset": 2048, 00:26:03.905 "data_size": 63488 00:26:03.905 } 00:26:03.905 ] 00:26:03.905 }' 00:26:03.905 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:03.905 09:30:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:04.471 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:04.731 [2024-07-15 09:30:13.566070] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:04.731 [2024-07-15 09:30:13.566109] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:04.731 00:26:04.731 Latency(us) 00:26:04.731 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:04.731 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 
00:26:04.731 raid_bdev1 : 11.53 93.17 279.50 0.00 0.00 14396.84 283.16 123093.70 00:26:04.731 =================================================================================================================== 00:26:04.731 Total : 93.17 279.50 0.00 0.00 14396.84 283.16 123093.70 00:26:04.731 [2024-07-15 09:30:13.670378] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:04.731 [2024-07-15 09:30:13.670411] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:04.731 [2024-07-15 09:30:13.670502] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:04.731 [2024-07-15 09:30:13.670516] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa408a0 name raid_bdev1, state offline 00:26:04.731 0 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:05.000 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:05.001 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:05.001 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:05.001 09:30:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:05.259 /dev/nbd0 00:26:05.259 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io 
-- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:05.518 1+0 records in 00:26:05.518 1+0 records out 00:26:05.518 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292257 s, 14.0 MB/s 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:05.518 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:05.776 /dev/nbd1 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:05.776 09:30:14 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:05.776 1+0 records in 00:26:05.776 1+0 records out 00:26:05.776 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00029435 s, 13.9 MB/s 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:05.776 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 
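A minimal sketch of the NBD-based data check being exercised in the trace above, using only rpc.py subcommands that appear in this log (nbd_start_disk, nbd_stop_disk) and the same /var/tmp/spdk-raid.sock socket; the RPC shell variable is introduced here for brevity, and the 1048576-byte cmp offset is assumed to correspond to the data_offset of 2048 blocks at the 512-byte blocklen reported in the raid_bdev JSON, i.e. the superblock region is skipped. This is not the test script itself, only an illustration of the pattern:

# Map the rebuilt member ("spare") and a surviving base bdev to NBD devices.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC nbd_start_disk spare /dev/nbd0
$RPC nbd_start_disk BaseBdev3 /dev/nbd1

# Wait until the kernel exposes both devices, as waitfornbd does above.
for n in nbd0 nbd1; do
    until grep -q -w "$n" /proc/partitions; do sleep 0.1; done
done

# Byte-compare everything past the 1 MiB metadata region; for raid1 the
# data areas of all members must be identical after the rebuild.
cmp -i 1048576 /dev/nbd0 /dev/nbd1

# Tear the NBD mappings down again.
$RPC nbd_stop_disk /dev/nbd1
$RPC nbd_stop_disk /dev/nbd0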
00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:06.035 09:30:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:06.293 /dev/nbd1 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:06.293 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:06.551 1+0 records in 00:26:06.551 1+0 records out 00:26:06.551 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260101 s, 15.7 MB/s 00:26:06.551 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:06.551 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:06.551 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:06.551 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:06.551 09:30:15 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:06.551 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:06.551 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:06.551 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:06.551 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:06.551 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:06.552 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:06.552 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:06.552 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:06.552 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:06.552 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:06.810 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:06.811 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:06.811 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:06.811 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:07.070 09:30:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:07.328 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:07.625 [2024-07-15 09:30:16.466900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:07.625 [2024-07-15 09:30:16.466954] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:07.625 [2024-07-15 09:30:16.466981] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x90eae0 00:26:07.625 [2024-07-15 09:30:16.466994] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:07.625 [2024-07-15 09:30:16.468642] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:07.625 [2024-07-15 09:30:16.468674] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:07.625 [2024-07-15 09:30:16.468765] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:07.625 [2024-07-15 09:30:16.468793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:07.625 [2024-07-15 09:30:16.468903] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:07.625 [2024-07-15 09:30:16.468991] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:07.625 spare 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.625 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.625 [2024-07-15 09:30:16.569311] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa40b20 00:26:07.625 [2024-07-15 09:30:16.569330] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:07.625 [2024-07-15 09:30:16.569538] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa42dd0 00:26:07.625 [2024-07-15 09:30:16.569698] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa40b20 00:26:07.625 [2024-07-15 09:30:16.569708] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa40b20 00:26:07.625 [2024-07-15 09:30:16.569819] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:07.896 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:07.896 "name": "raid_bdev1", 00:26:07.896 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:07.896 "strip_size_kb": 0, 00:26:07.896 "state": "online", 00:26:07.896 "raid_level": "raid1", 00:26:07.896 "superblock": true, 00:26:07.896 "num_base_bdevs": 4, 00:26:07.896 "num_base_bdevs_discovered": 3, 00:26:07.896 "num_base_bdevs_operational": 3, 00:26:07.896 "base_bdevs_list": [ 00:26:07.896 { 00:26:07.896 "name": "spare", 00:26:07.896 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:07.896 "is_configured": true, 00:26:07.896 "data_offset": 2048, 00:26:07.896 "data_size": 63488 00:26:07.896 }, 00:26:07.896 { 00:26:07.896 "name": null, 00:26:07.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.896 "is_configured": false, 00:26:07.896 "data_offset": 2048, 00:26:07.896 "data_size": 63488 00:26:07.896 }, 00:26:07.896 { 00:26:07.896 "name": "BaseBdev3", 00:26:07.896 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:07.896 "is_configured": true, 00:26:07.896 "data_offset": 2048, 00:26:07.896 "data_size": 63488 00:26:07.896 }, 00:26:07.896 { 00:26:07.896 "name": "BaseBdev4", 00:26:07.896 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:07.896 "is_configured": true, 00:26:07.896 "data_offset": 2048, 00:26:07.896 "data_size": 63488 00:26:07.896 } 00:26:07.896 ] 00:26:07.896 }' 00:26:07.896 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:07.896 09:30:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:08.464 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:08.464 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:08.464 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:08.464 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:08.464 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:08.464 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.464 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.723 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:08.723 "name": "raid_bdev1", 00:26:08.723 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:08.723 "strip_size_kb": 0, 00:26:08.723 "state": "online", 00:26:08.723 "raid_level": "raid1", 00:26:08.723 "superblock": true, 00:26:08.723 "num_base_bdevs": 4, 00:26:08.723 "num_base_bdevs_discovered": 3, 00:26:08.723 "num_base_bdevs_operational": 3, 00:26:08.723 "base_bdevs_list": [ 00:26:08.723 { 00:26:08.723 "name": "spare", 00:26:08.723 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:08.723 "is_configured": true, 00:26:08.723 "data_offset": 2048, 00:26:08.723 "data_size": 63488 00:26:08.723 }, 00:26:08.723 { 00:26:08.723 "name": null, 00:26:08.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.723 "is_configured": false, 00:26:08.723 "data_offset": 2048, 00:26:08.723 "data_size": 63488 00:26:08.723 }, 00:26:08.723 { 00:26:08.723 "name": "BaseBdev3", 00:26:08.723 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:08.723 "is_configured": true, 00:26:08.723 "data_offset": 2048, 00:26:08.723 "data_size": 63488 00:26:08.723 }, 00:26:08.723 { 00:26:08.723 "name": "BaseBdev4", 00:26:08.723 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:08.723 "is_configured": true, 00:26:08.723 "data_offset": 2048, 00:26:08.723 "data_size": 63488 00:26:08.723 } 00:26:08.723 ] 00:26:08.723 }' 00:26:08.723 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:08.723 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:08.723 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:08.982 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:08.982 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.982 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:09.240 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:09.240 09:30:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:09.240 [2024-07-15 09:30:18.167732] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:09.240 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:09.240 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:09.240 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:09.240 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:09.240 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:09.240 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:09.240 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:09.240 09:30:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:09.240 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:09.240 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:09.502 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.502 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:09.502 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:09.502 "name": "raid_bdev1", 00:26:09.502 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:09.502 "strip_size_kb": 0, 00:26:09.502 "state": "online", 00:26:09.502 "raid_level": "raid1", 00:26:09.502 "superblock": true, 00:26:09.502 "num_base_bdevs": 4, 00:26:09.502 "num_base_bdevs_discovered": 2, 00:26:09.502 "num_base_bdevs_operational": 2, 00:26:09.502 "base_bdevs_list": [ 00:26:09.502 { 00:26:09.502 "name": null, 00:26:09.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:09.502 "is_configured": false, 00:26:09.502 "data_offset": 2048, 00:26:09.502 "data_size": 63488 00:26:09.502 }, 00:26:09.502 { 00:26:09.502 "name": null, 00:26:09.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:09.502 "is_configured": false, 00:26:09.502 "data_offset": 2048, 00:26:09.502 "data_size": 63488 00:26:09.502 }, 00:26:09.502 { 00:26:09.502 "name": "BaseBdev3", 00:26:09.502 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:09.502 "is_configured": true, 00:26:09.502 "data_offset": 2048, 00:26:09.502 "data_size": 63488 00:26:09.502 }, 00:26:09.502 { 00:26:09.502 "name": "BaseBdev4", 00:26:09.502 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:09.502 "is_configured": true, 00:26:09.502 "data_offset": 2048, 00:26:09.502 "data_size": 63488 00:26:09.502 } 00:26:09.502 ] 00:26:09.502 }' 00:26:09.502 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:09.502 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:10.068 09:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:10.326 [2024-07-15 09:30:19.210658] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:10.326 [2024-07-15 09:30:19.210816] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:10.326 [2024-07-15 09:30:19.210832] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
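A condensed sketch of the re-add-and-rebuild step driven here, not the test script itself: it reuses the bdev_raid_add_base_bdev and bdev_raid_get_bdevs calls and the jq filters traced in this log, while the RPC variable and the polling loop (in place of the script's sleep 1 plus verify_raid_bdev_process) are assumptions for illustration:

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Hand the spare bdev back to the array; examine_sb re-adds it because its
# superblock seq_number (5) is older than the array's (6), as logged above,
# and a rebuild process is started on raid_bdev1.
$RPC bdev_raid_add_base_bdev raid_bdev1 spare

# Poll bdev_raid_get_bdevs until the rebuild process is no longer reported.
while true; do
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [ "$(echo "$info" | jq -r '.process.type // "none"')" = "rebuild" ] || break
    sleep 1
done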
00:26:10.326 [2024-07-15 09:30:19.210861] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:10.326 [2024-07-15 09:30:19.215334] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xabede0 00:26:10.326 [2024-07-15 09:30:19.217671] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:10.326 09:30:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.704 "name": "raid_bdev1", 00:26:11.704 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:11.704 "strip_size_kb": 0, 00:26:11.704 "state": "online", 00:26:11.704 "raid_level": "raid1", 00:26:11.704 "superblock": true, 00:26:11.704 "num_base_bdevs": 4, 00:26:11.704 "num_base_bdevs_discovered": 3, 00:26:11.704 "num_base_bdevs_operational": 3, 00:26:11.704 "process": { 00:26:11.704 "type": "rebuild", 00:26:11.704 "target": "spare", 00:26:11.704 "progress": { 00:26:11.704 "blocks": 24576, 00:26:11.704 "percent": 38 00:26:11.704 } 00:26:11.704 }, 00:26:11.704 "base_bdevs_list": [ 00:26:11.704 { 00:26:11.704 "name": "spare", 00:26:11.704 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:11.704 "is_configured": true, 00:26:11.704 "data_offset": 2048, 00:26:11.704 "data_size": 63488 00:26:11.704 }, 00:26:11.704 { 00:26:11.704 "name": null, 00:26:11.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.704 "is_configured": false, 00:26:11.704 "data_offset": 2048, 00:26:11.704 "data_size": 63488 00:26:11.704 }, 00:26:11.704 { 00:26:11.704 "name": "BaseBdev3", 00:26:11.704 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:11.704 "is_configured": true, 00:26:11.704 "data_offset": 2048, 00:26:11.704 "data_size": 63488 00:26:11.704 }, 00:26:11.704 { 00:26:11.704 "name": "BaseBdev4", 00:26:11.704 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:11.704 "is_configured": true, 00:26:11.704 "data_offset": 2048, 00:26:11.704 "data_size": 63488 00:26:11.704 } 00:26:11.704 ] 00:26:11.704 }' 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:11.704 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:11.962 [2024-07-15 09:30:20.806058] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:11.962 [2024-07-15 09:30:20.830593] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:11.962 [2024-07-15 09:30:20.830639] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:11.962 [2024-07-15 09:30:20.830655] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:11.962 [2024-07-15 09:30:20.830664] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.962 09:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.221 09:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:12.221 "name": "raid_bdev1", 00:26:12.221 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:12.221 "strip_size_kb": 0, 00:26:12.221 "state": "online", 00:26:12.221 "raid_level": "raid1", 00:26:12.221 "superblock": true, 00:26:12.221 "num_base_bdevs": 4, 00:26:12.221 "num_base_bdevs_discovered": 2, 00:26:12.221 "num_base_bdevs_operational": 2, 00:26:12.221 "base_bdevs_list": [ 00:26:12.221 { 00:26:12.221 "name": null, 00:26:12.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.221 "is_configured": false, 00:26:12.221 "data_offset": 2048, 00:26:12.221 "data_size": 63488 00:26:12.221 }, 00:26:12.221 { 00:26:12.221 "name": null, 00:26:12.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.221 "is_configured": false, 00:26:12.221 "data_offset": 2048, 00:26:12.221 "data_size": 63488 00:26:12.221 }, 00:26:12.221 { 00:26:12.221 "name": "BaseBdev3", 00:26:12.221 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:12.221 "is_configured": true, 00:26:12.221 "data_offset": 2048, 00:26:12.221 "data_size": 63488 00:26:12.221 }, 00:26:12.221 { 00:26:12.221 "name": "BaseBdev4", 00:26:12.221 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:12.221 "is_configured": true, 
00:26:12.221 "data_offset": 2048, 00:26:12.221 "data_size": 63488 00:26:12.221 } 00:26:12.221 ] 00:26:12.221 }' 00:26:12.221 09:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:12.221 09:30:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:12.788 09:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:13.045 [2024-07-15 09:30:21.922576] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:13.045 [2024-07-15 09:30:21.922629] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:13.046 [2024-07-15 09:30:21.922652] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa42c50 00:26:13.046 [2024-07-15 09:30:21.922665] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:13.046 [2024-07-15 09:30:21.923066] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:13.046 [2024-07-15 09:30:21.923087] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:13.046 [2024-07-15 09:30:21.923170] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:13.046 [2024-07-15 09:30:21.923184] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:13.046 [2024-07-15 09:30:21.923195] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:13.046 [2024-07-15 09:30:21.923216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:13.046 [2024-07-15 09:30:21.927711] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa426a0 00:26:13.046 spare 00:26:13.046 [2024-07-15 09:30:21.929119] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:13.046 09:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:14.422 09:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:14.422 09:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:14.422 09:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:14.422 09:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:14.422 09:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:14.422 09:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.422 09:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.422 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:14.422 "name": "raid_bdev1", 00:26:14.422 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:14.422 "strip_size_kb": 0, 00:26:14.422 "state": "online", 00:26:14.422 "raid_level": "raid1", 00:26:14.422 "superblock": true, 00:26:14.422 "num_base_bdevs": 4, 00:26:14.422 "num_base_bdevs_discovered": 3, 00:26:14.422 
"num_base_bdevs_operational": 3, 00:26:14.422 "process": { 00:26:14.422 "type": "rebuild", 00:26:14.422 "target": "spare", 00:26:14.422 "progress": { 00:26:14.422 "blocks": 24576, 00:26:14.422 "percent": 38 00:26:14.422 } 00:26:14.422 }, 00:26:14.422 "base_bdevs_list": [ 00:26:14.422 { 00:26:14.422 "name": "spare", 00:26:14.422 "uuid": "3a4117e6-ce22-560c-86cf-5dfb46a2abca", 00:26:14.422 "is_configured": true, 00:26:14.422 "data_offset": 2048, 00:26:14.422 "data_size": 63488 00:26:14.422 }, 00:26:14.422 { 00:26:14.422 "name": null, 00:26:14.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:14.422 "is_configured": false, 00:26:14.422 "data_offset": 2048, 00:26:14.422 "data_size": 63488 00:26:14.422 }, 00:26:14.422 { 00:26:14.422 "name": "BaseBdev3", 00:26:14.422 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:14.422 "is_configured": true, 00:26:14.422 "data_offset": 2048, 00:26:14.422 "data_size": 63488 00:26:14.422 }, 00:26:14.422 { 00:26:14.422 "name": "BaseBdev4", 00:26:14.422 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:14.422 "is_configured": true, 00:26:14.422 "data_offset": 2048, 00:26:14.422 "data_size": 63488 00:26:14.422 } 00:26:14.422 ] 00:26:14.422 }' 00:26:14.422 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:14.422 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:14.422 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:14.422 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:14.422 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:14.679 [2024-07-15 09:30:23.509586] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:14.679 [2024-07-15 09:30:23.541603] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:14.679 [2024-07-15 09:30:23.541648] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:14.680 [2024-07-15 09:30:23.541664] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:14.680 [2024-07-15 09:30:23.541673] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:14.680 
09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.680 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.937 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:14.937 "name": "raid_bdev1", 00:26:14.937 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:14.937 "strip_size_kb": 0, 00:26:14.937 "state": "online", 00:26:14.937 "raid_level": "raid1", 00:26:14.937 "superblock": true, 00:26:14.937 "num_base_bdevs": 4, 00:26:14.937 "num_base_bdevs_discovered": 2, 00:26:14.937 "num_base_bdevs_operational": 2, 00:26:14.937 "base_bdevs_list": [ 00:26:14.937 { 00:26:14.937 "name": null, 00:26:14.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:14.937 "is_configured": false, 00:26:14.937 "data_offset": 2048, 00:26:14.937 "data_size": 63488 00:26:14.937 }, 00:26:14.937 { 00:26:14.937 "name": null, 00:26:14.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:14.937 "is_configured": false, 00:26:14.937 "data_offset": 2048, 00:26:14.937 "data_size": 63488 00:26:14.937 }, 00:26:14.937 { 00:26:14.937 "name": "BaseBdev3", 00:26:14.937 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:14.937 "is_configured": true, 00:26:14.937 "data_offset": 2048, 00:26:14.937 "data_size": 63488 00:26:14.937 }, 00:26:14.937 { 00:26:14.937 "name": "BaseBdev4", 00:26:14.937 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:14.937 "is_configured": true, 00:26:14.937 "data_offset": 2048, 00:26:14.937 "data_size": 63488 00:26:14.937 } 00:26:14.937 ] 00:26:14.937 }' 00:26:14.937 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:14.937 09:30:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:15.501 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:15.501 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:15.501 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:15.501 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:15.501 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:15.501 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.501 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:15.758 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:15.758 "name": "raid_bdev1", 00:26:15.758 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:15.758 "strip_size_kb": 0, 00:26:15.758 "state": "online", 00:26:15.758 "raid_level": "raid1", 00:26:15.758 "superblock": true, 00:26:15.758 "num_base_bdevs": 4, 00:26:15.758 "num_base_bdevs_discovered": 2, 00:26:15.758 "num_base_bdevs_operational": 2, 00:26:15.758 "base_bdevs_list": [ 00:26:15.758 { 00:26:15.758 "name": null, 00:26:15.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:15.758 
"is_configured": false, 00:26:15.758 "data_offset": 2048, 00:26:15.758 "data_size": 63488 00:26:15.758 }, 00:26:15.758 { 00:26:15.758 "name": null, 00:26:15.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:15.758 "is_configured": false, 00:26:15.758 "data_offset": 2048, 00:26:15.758 "data_size": 63488 00:26:15.758 }, 00:26:15.758 { 00:26:15.758 "name": "BaseBdev3", 00:26:15.758 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:15.758 "is_configured": true, 00:26:15.758 "data_offset": 2048, 00:26:15.758 "data_size": 63488 00:26:15.758 }, 00:26:15.758 { 00:26:15.758 "name": "BaseBdev4", 00:26:15.758 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:15.758 "is_configured": true, 00:26:15.758 "data_offset": 2048, 00:26:15.758 "data_size": 63488 00:26:15.758 } 00:26:15.758 ] 00:26:15.758 }' 00:26:15.758 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:16.015 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:16.015 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:16.015 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:16.015 09:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:16.272 09:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:16.530 [2024-07-15 09:30:25.235193] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:16.530 [2024-07-15 09:30:25.235247] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.530 [2024-07-15 09:30:25.235271] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa46620 00:26:16.530 [2024-07-15 09:30:25.235284] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.530 [2024-07-15 09:30:25.235661] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.530 [2024-07-15 09:30:25.235683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:16.530 [2024-07-15 09:30:25.235757] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:16.530 [2024-07-15 09:30:25.235770] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:16.530 [2024-07-15 09:30:25.235782] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:16.530 BaseBdev1 00:26:16.530 09:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.462 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.722 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:17.722 "name": "raid_bdev1", 00:26:17.722 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:17.722 "strip_size_kb": 0, 00:26:17.723 "state": "online", 00:26:17.723 "raid_level": "raid1", 00:26:17.723 "superblock": true, 00:26:17.723 "num_base_bdevs": 4, 00:26:17.723 "num_base_bdevs_discovered": 2, 00:26:17.723 "num_base_bdevs_operational": 2, 00:26:17.723 "base_bdevs_list": [ 00:26:17.723 { 00:26:17.723 "name": null, 00:26:17.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:17.723 "is_configured": false, 00:26:17.723 "data_offset": 2048, 00:26:17.723 "data_size": 63488 00:26:17.723 }, 00:26:17.723 { 00:26:17.723 "name": null, 00:26:17.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:17.723 "is_configured": false, 00:26:17.723 "data_offset": 2048, 00:26:17.723 "data_size": 63488 00:26:17.723 }, 00:26:17.723 { 00:26:17.723 "name": "BaseBdev3", 00:26:17.723 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:17.723 "is_configured": true, 00:26:17.723 "data_offset": 2048, 00:26:17.723 "data_size": 63488 00:26:17.723 }, 00:26:17.723 { 00:26:17.723 "name": "BaseBdev4", 00:26:17.723 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:17.723 "is_configured": true, 00:26:17.723 "data_offset": 2048, 00:26:17.723 "data_size": 63488 00:26:17.723 } 00:26:17.723 ] 00:26:17.723 }' 00:26:17.723 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:17.723 09:30:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:18.292 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:18.292 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:18.292 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:18.292 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:18.292 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:18.292 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.292 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:26:18.550 "name": "raid_bdev1", 00:26:18.550 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:18.550 "strip_size_kb": 0, 00:26:18.550 "state": "online", 00:26:18.550 "raid_level": "raid1", 00:26:18.550 "superblock": true, 00:26:18.550 "num_base_bdevs": 4, 00:26:18.550 "num_base_bdevs_discovered": 2, 00:26:18.550 "num_base_bdevs_operational": 2, 00:26:18.550 "base_bdevs_list": [ 00:26:18.550 { 00:26:18.550 "name": null, 00:26:18.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.550 "is_configured": false, 00:26:18.550 "data_offset": 2048, 00:26:18.550 "data_size": 63488 00:26:18.550 }, 00:26:18.550 { 00:26:18.550 "name": null, 00:26:18.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.550 "is_configured": false, 00:26:18.550 "data_offset": 2048, 00:26:18.550 "data_size": 63488 00:26:18.550 }, 00:26:18.550 { 00:26:18.550 "name": "BaseBdev3", 00:26:18.550 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:18.550 "is_configured": true, 00:26:18.550 "data_offset": 2048, 00:26:18.550 "data_size": 63488 00:26:18.550 }, 00:26:18.550 { 00:26:18.550 "name": "BaseBdev4", 00:26:18.550 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:18.550 "is_configured": true, 00:26:18.550 "data_offset": 2048, 00:26:18.550 "data_size": 63488 00:26:18.550 } 00:26:18.550 ] 00:26:18.550 }' 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:18.550 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:18.550 
09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:18.809 [2024-07-15 09:30:27.657998] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:18.809 [2024-07-15 09:30:27.658140] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:18.809 [2024-07-15 09:30:27.658156] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:18.809 request: 00:26:18.809 { 00:26:18.809 "base_bdev": "BaseBdev1", 00:26:18.809 "raid_bdev": "raid_bdev1", 00:26:18.809 "method": "bdev_raid_add_base_bdev", 00:26:18.809 "req_id": 1 00:26:18.809 } 00:26:18.809 Got JSON-RPC error response 00:26:18.809 response: 00:26:18.809 { 00:26:18.809 "code": -22, 00:26:18.809 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:18.809 } 00:26:18.809 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:26:18.809 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:18.809 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:18.809 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:18.809 09:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.743 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:20.002 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:20.002 "name": "raid_bdev1", 00:26:20.002 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:20.002 "strip_size_kb": 0, 00:26:20.002 "state": "online", 00:26:20.002 "raid_level": "raid1", 00:26:20.002 "superblock": true, 00:26:20.002 "num_base_bdevs": 4, 00:26:20.002 "num_base_bdevs_discovered": 2, 00:26:20.002 "num_base_bdevs_operational": 2, 00:26:20.002 "base_bdevs_list": [ 
00:26:20.002 { 00:26:20.002 "name": null, 00:26:20.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:20.002 "is_configured": false, 00:26:20.002 "data_offset": 2048, 00:26:20.002 "data_size": 63488 00:26:20.002 }, 00:26:20.002 { 00:26:20.002 "name": null, 00:26:20.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:20.002 "is_configured": false, 00:26:20.002 "data_offset": 2048, 00:26:20.002 "data_size": 63488 00:26:20.002 }, 00:26:20.002 { 00:26:20.002 "name": "BaseBdev3", 00:26:20.002 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:20.002 "is_configured": true, 00:26:20.002 "data_offset": 2048, 00:26:20.002 "data_size": 63488 00:26:20.002 }, 00:26:20.002 { 00:26:20.002 "name": "BaseBdev4", 00:26:20.002 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:20.002 "is_configured": true, 00:26:20.002 "data_offset": 2048, 00:26:20.002 "data_size": 63488 00:26:20.002 } 00:26:20.002 ] 00:26:20.002 }' 00:26:20.002 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:20.002 09:30:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:20.938 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:20.938 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:20.938 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:20.938 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:20.938 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:20.938 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.938 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:20.938 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:20.938 "name": "raid_bdev1", 00:26:20.938 "uuid": "34122278-e259-4008-baba-32325a6cb8c8", 00:26:20.938 "strip_size_kb": 0, 00:26:20.938 "state": "online", 00:26:20.938 "raid_level": "raid1", 00:26:20.938 "superblock": true, 00:26:20.938 "num_base_bdevs": 4, 00:26:20.938 "num_base_bdevs_discovered": 2, 00:26:20.938 "num_base_bdevs_operational": 2, 00:26:20.938 "base_bdevs_list": [ 00:26:20.938 { 00:26:20.938 "name": null, 00:26:20.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:20.938 "is_configured": false, 00:26:20.938 "data_offset": 2048, 00:26:20.938 "data_size": 63488 00:26:20.938 }, 00:26:20.938 { 00:26:20.938 "name": null, 00:26:20.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:20.938 "is_configured": false, 00:26:20.938 "data_offset": 2048, 00:26:20.938 "data_size": 63488 00:26:20.938 }, 00:26:20.938 { 00:26:20.938 "name": "BaseBdev3", 00:26:20.938 "uuid": "f99b3614-e846-5772-954e-f1213c56ae93", 00:26:20.938 "is_configured": true, 00:26:20.938 "data_offset": 2048, 00:26:20.938 "data_size": 63488 00:26:20.938 }, 00:26:20.938 { 00:26:20.938 "name": "BaseBdev4", 00:26:20.938 "uuid": "78feea0b-12f8-513e-987c-4b3a711e765c", 00:26:20.938 "is_configured": true, 00:26:20.938 "data_offset": 2048, 00:26:20.938 "data_size": 63488 00:26:20.939 } 00:26:20.939 ] 00:26:20.939 }' 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 217820 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 217820 ']' 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 217820 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:20.939 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 217820 00:26:21.198 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:21.198 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:21.198 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 217820' 00:26:21.198 killing process with pid 217820 00:26:21.198 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 217820 00:26:21.198 Received shutdown signal, test time was about 27.732842 seconds 00:26:21.198 00:26:21.198 Latency(us) 00:26:21.198 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:21.198 =================================================================================================================== 00:26:21.198 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:21.198 [2024-07-15 09:30:29.911011] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:21.198 [2024-07-15 09:30:29.911116] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:21.198 09:30:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 217820 00:26:21.198 [2024-07-15 09:30:29.911183] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:21.198 [2024-07-15 09:30:29.911198] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa40b20 name raid_bdev1, state offline 00:26:21.198 [2024-07-15 09:30:29.954154] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:21.457 09:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:21.457 00:26:21.457 real 0m33.507s 00:26:21.457 user 0m52.632s 00:26:21.457 sys 0m5.314s 00:26:21.457 09:30:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:21.457 09:30:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:21.457 ************************************ 00:26:21.457 END TEST raid_rebuild_test_sb_io 00:26:21.457 ************************************ 00:26:21.457 09:30:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:21.457 09:30:30 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:26:21.457 09:30:30 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:26:21.457 09:30:30 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k 
raid_state_function_test raid1 2 true 00:26:21.457 09:30:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:21.457 09:30:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:21.457 09:30:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:21.457 ************************************ 00:26:21.457 START TEST raid_state_function_test_sb_4k 00:26:21.457 ************************************ 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=223185 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 223185' 00:26:21.457 Process raid pid: 223185 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 223185 /var/tmp/spdk-raid.sock 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 223185 ']' 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:21.457 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:21.457 09:30:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:21.457 [2024-07-15 09:30:30.335443] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:26:21.457 [2024-07-15 09:30:30.335507] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:21.716 [2024-07-15 09:30:30.453126] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:21.716 [2024-07-15 09:30:30.549759] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:21.716 [2024-07-15 09:30:30.609333] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:21.716 [2024-07-15 09:30:30.609370] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:22.652 [2024-07-15 09:30:31.485958] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:22.652 [2024-07-15 09:30:31.486003] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:22.652 [2024-07-15 09:30:31.486014] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:22.652 [2024-07-15 09:30:31.486031] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.652 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:22.910 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.910 "name": "Existed_Raid", 00:26:22.910 "uuid": "72d03499-dabb-4013-a21c-62214c184ec2", 00:26:22.910 "strip_size_kb": 0, 00:26:22.910 "state": "configuring", 00:26:22.910 "raid_level": "raid1", 00:26:22.910 "superblock": true, 00:26:22.910 "num_base_bdevs": 2, 00:26:22.910 "num_base_bdevs_discovered": 0, 00:26:22.910 "num_base_bdevs_operational": 2, 00:26:22.910 "base_bdevs_list": [ 00:26:22.910 { 00:26:22.910 "name": "BaseBdev1", 00:26:22.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:22.910 "is_configured": false, 00:26:22.910 "data_offset": 0, 00:26:22.910 "data_size": 0 00:26:22.910 }, 00:26:22.910 { 00:26:22.910 "name": "BaseBdev2", 00:26:22.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:22.910 "is_configured": false, 00:26:22.910 "data_offset": 0, 00:26:22.910 "data_size": 0 00:26:22.910 } 00:26:22.910 ] 00:26:22.910 }' 00:26:22.910 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.910 09:30:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:23.843 09:30:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:24.101 [2024-07-15 09:30:32.825321] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:24.101 [2024-07-15 09:30:32.825352] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1114a80 name Existed_Raid, state configuring 00:26:24.101 09:30:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:24.358 [2024-07-15 09:30:33.069986] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:24.358 [2024-07-15 09:30:33.070020] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:24.358 [2024-07-15 09:30:33.070030] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:24.358 [2024-07-15 09:30:33.070041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:24.358 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:26:24.615 [2024-07-15 09:30:33.324499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:24.615 BaseBdev1 00:26:24.615 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:24.615 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:24.615 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:24.615 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:24.615 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:24.615 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:24.615 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:24.873 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:24.873 [ 00:26:24.873 { 00:26:24.873 "name": "BaseBdev1", 00:26:24.873 "aliases": [ 00:26:24.873 "8630a970-1e09-453f-8a90-9a95d223eed1" 00:26:24.873 ], 00:26:24.873 "product_name": "Malloc disk", 00:26:24.873 "block_size": 4096, 00:26:24.873 "num_blocks": 8192, 00:26:24.873 "uuid": "8630a970-1e09-453f-8a90-9a95d223eed1", 00:26:24.873 "assigned_rate_limits": { 00:26:24.873 "rw_ios_per_sec": 0, 00:26:24.873 "rw_mbytes_per_sec": 0, 00:26:24.873 "r_mbytes_per_sec": 0, 00:26:24.873 "w_mbytes_per_sec": 0 00:26:24.873 }, 00:26:24.873 "claimed": true, 00:26:24.873 "claim_type": "exclusive_write", 00:26:24.873 "zoned": false, 00:26:24.873 "supported_io_types": { 00:26:24.873 "read": true, 00:26:24.873 "write": true, 00:26:24.873 "unmap": true, 00:26:24.873 "flush": true, 00:26:24.873 "reset": true, 00:26:24.873 "nvme_admin": false, 00:26:24.873 "nvme_io": false, 00:26:24.873 "nvme_io_md": false, 00:26:24.873 "write_zeroes": true, 00:26:24.873 "zcopy": true, 00:26:24.873 "get_zone_info": false, 00:26:24.873 "zone_management": false, 00:26:24.873 "zone_append": false, 00:26:24.873 "compare": false, 00:26:24.873 "compare_and_write": false, 00:26:24.873 "abort": true, 00:26:24.873 "seek_hole": false, 00:26:24.873 "seek_data": false, 00:26:24.873 "copy": true, 00:26:24.873 "nvme_iov_md": false 00:26:24.873 }, 00:26:24.873 "memory_domains": [ 00:26:24.873 { 00:26:24.873 "dma_device_id": "system", 00:26:24.873 "dma_device_type": 1 00:26:24.873 }, 00:26:24.873 { 00:26:24.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:24.873 "dma_device_type": 2 00:26:24.873 } 00:26:24.873 ], 00:26:24.873 "driver_specific": {} 00:26:24.873 } 00:26:24.873 ] 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:25.131 
09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.131 09:30:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:25.392 09:30:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:25.392 "name": "Existed_Raid", 00:26:25.392 "uuid": "4352583d-c293-42af-8da2-764d415965d3", 00:26:25.392 "strip_size_kb": 0, 00:26:25.392 "state": "configuring", 00:26:25.392 "raid_level": "raid1", 00:26:25.392 "superblock": true, 00:26:25.392 "num_base_bdevs": 2, 00:26:25.392 "num_base_bdevs_discovered": 1, 00:26:25.392 "num_base_bdevs_operational": 2, 00:26:25.392 "base_bdevs_list": [ 00:26:25.392 { 00:26:25.392 "name": "BaseBdev1", 00:26:25.392 "uuid": "8630a970-1e09-453f-8a90-9a95d223eed1", 00:26:25.392 "is_configured": true, 00:26:25.392 "data_offset": 256, 00:26:25.392 "data_size": 7936 00:26:25.392 }, 00:26:25.392 { 00:26:25.392 "name": "BaseBdev2", 00:26:25.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.392 "is_configured": false, 00:26:25.392 "data_offset": 0, 00:26:25.392 "data_size": 0 00:26:25.392 } 00:26:25.392 ] 00:26:25.392 }' 00:26:25.392 09:30:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:25.392 09:30:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:26.009 09:30:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:26.009 [2024-07-15 09:30:34.908699] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:26.009 [2024-07-15 09:30:34.908740] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1114350 name Existed_Raid, state configuring 00:26:26.009 09:30:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:26.267 [2024-07-15 09:30:35.157393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:26.267 [2024-07-15 09:30:35.158875] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:26.267 [2024-07-15 09:30:35.158909] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.267 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:26.525 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.525 "name": "Existed_Raid", 00:26:26.525 "uuid": "3fdc4e56-2d17-411b-a79e-4919caf6fd9e", 00:26:26.525 "strip_size_kb": 0, 00:26:26.525 "state": "configuring", 00:26:26.525 "raid_level": "raid1", 00:26:26.525 "superblock": true, 00:26:26.525 "num_base_bdevs": 2, 00:26:26.525 "num_base_bdevs_discovered": 1, 00:26:26.525 "num_base_bdevs_operational": 2, 00:26:26.525 "base_bdevs_list": [ 00:26:26.525 { 00:26:26.525 "name": "BaseBdev1", 00:26:26.525 "uuid": "8630a970-1e09-453f-8a90-9a95d223eed1", 00:26:26.525 "is_configured": true, 00:26:26.525 "data_offset": 256, 00:26:26.525 "data_size": 7936 00:26:26.525 }, 00:26:26.525 { 00:26:26.525 "name": "BaseBdev2", 00:26:26.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:26.525 "is_configured": false, 00:26:26.525 "data_offset": 0, 00:26:26.525 "data_size": 0 00:26:26.525 } 00:26:26.525 ] 00:26:26.525 }' 00:26:26.525 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:26.525 09:30:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:27.090 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:26:27.348 [2024-07-15 09:30:36.247581] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:27.348 [2024-07-15 09:30:36.247730] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1115000 00:26:27.348 [2024-07-15 09:30:36.247744] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:27.348 [2024-07-15 09:30:36.247914] 
bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x102f0c0 00:26:27.348 [2024-07-15 09:30:36.248054] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1115000 00:26:27.348 [2024-07-15 09:30:36.248065] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1115000 00:26:27.348 [2024-07-15 09:30:36.248156] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:27.348 BaseBdev2 00:26:27.348 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:27.348 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:27.348 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:27.348 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:27.348 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:27.348 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:27.348 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:27.605 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:27.864 [ 00:26:27.864 { 00:26:27.864 "name": "BaseBdev2", 00:26:27.864 "aliases": [ 00:26:27.864 "a5b39ed6-6df6-41af-800f-c42b434843ae" 00:26:27.864 ], 00:26:27.864 "product_name": "Malloc disk", 00:26:27.864 "block_size": 4096, 00:26:27.864 "num_blocks": 8192, 00:26:27.864 "uuid": "a5b39ed6-6df6-41af-800f-c42b434843ae", 00:26:27.864 "assigned_rate_limits": { 00:26:27.864 "rw_ios_per_sec": 0, 00:26:27.864 "rw_mbytes_per_sec": 0, 00:26:27.864 "r_mbytes_per_sec": 0, 00:26:27.864 "w_mbytes_per_sec": 0 00:26:27.864 }, 00:26:27.864 "claimed": true, 00:26:27.864 "claim_type": "exclusive_write", 00:26:27.864 "zoned": false, 00:26:27.864 "supported_io_types": { 00:26:27.864 "read": true, 00:26:27.864 "write": true, 00:26:27.864 "unmap": true, 00:26:27.864 "flush": true, 00:26:27.864 "reset": true, 00:26:27.864 "nvme_admin": false, 00:26:27.864 "nvme_io": false, 00:26:27.864 "nvme_io_md": false, 00:26:27.864 "write_zeroes": true, 00:26:27.864 "zcopy": true, 00:26:27.864 "get_zone_info": false, 00:26:27.864 "zone_management": false, 00:26:27.864 "zone_append": false, 00:26:27.864 "compare": false, 00:26:27.864 "compare_and_write": false, 00:26:27.864 "abort": true, 00:26:27.864 "seek_hole": false, 00:26:27.864 "seek_data": false, 00:26:27.864 "copy": true, 00:26:27.864 "nvme_iov_md": false 00:26:27.864 }, 00:26:27.864 "memory_domains": [ 00:26:27.864 { 00:26:27.864 "dma_device_id": "system", 00:26:27.864 "dma_device_type": 1 00:26:27.864 }, 00:26:27.864 { 00:26:27.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:27.864 "dma_device_type": 2 00:26:27.864 } 00:26:27.864 ], 00:26:27.864 "driver_specific": {} 00:26:27.864 } 00:26:27.864 ] 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:27.864 09:30:36 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.864 09:30:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:28.122 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:28.122 "name": "Existed_Raid", 00:26:28.122 "uuid": "3fdc4e56-2d17-411b-a79e-4919caf6fd9e", 00:26:28.122 "strip_size_kb": 0, 00:26:28.122 "state": "online", 00:26:28.122 "raid_level": "raid1", 00:26:28.122 "superblock": true, 00:26:28.122 "num_base_bdevs": 2, 00:26:28.122 "num_base_bdevs_discovered": 2, 00:26:28.122 "num_base_bdevs_operational": 2, 00:26:28.122 "base_bdevs_list": [ 00:26:28.122 { 00:26:28.122 "name": "BaseBdev1", 00:26:28.122 "uuid": "8630a970-1e09-453f-8a90-9a95d223eed1", 00:26:28.122 "is_configured": true, 00:26:28.122 "data_offset": 256, 00:26:28.122 "data_size": 7936 00:26:28.122 }, 00:26:28.122 { 00:26:28.122 "name": "BaseBdev2", 00:26:28.122 "uuid": "a5b39ed6-6df6-41af-800f-c42b434843ae", 00:26:28.122 "is_configured": true, 00:26:28.122 "data_offset": 256, 00:26:28.122 "data_size": 7936 00:26:28.122 } 00:26:28.122 ] 00:26:28.122 }' 00:26:28.122 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:28.122 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:28.689 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:28.689 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:28.689 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:28.689 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:28.689 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:28.689 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 
00:26:28.689 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:28.689 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:28.948 [2024-07-15 09:30:37.759868] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:28.948 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:28.948 "name": "Existed_Raid", 00:26:28.948 "aliases": [ 00:26:28.948 "3fdc4e56-2d17-411b-a79e-4919caf6fd9e" 00:26:28.948 ], 00:26:28.948 "product_name": "Raid Volume", 00:26:28.948 "block_size": 4096, 00:26:28.948 "num_blocks": 7936, 00:26:28.948 "uuid": "3fdc4e56-2d17-411b-a79e-4919caf6fd9e", 00:26:28.948 "assigned_rate_limits": { 00:26:28.948 "rw_ios_per_sec": 0, 00:26:28.948 "rw_mbytes_per_sec": 0, 00:26:28.948 "r_mbytes_per_sec": 0, 00:26:28.948 "w_mbytes_per_sec": 0 00:26:28.948 }, 00:26:28.948 "claimed": false, 00:26:28.948 "zoned": false, 00:26:28.948 "supported_io_types": { 00:26:28.948 "read": true, 00:26:28.948 "write": true, 00:26:28.948 "unmap": false, 00:26:28.948 "flush": false, 00:26:28.948 "reset": true, 00:26:28.948 "nvme_admin": false, 00:26:28.948 "nvme_io": false, 00:26:28.948 "nvme_io_md": false, 00:26:28.948 "write_zeroes": true, 00:26:28.948 "zcopy": false, 00:26:28.948 "get_zone_info": false, 00:26:28.948 "zone_management": false, 00:26:28.948 "zone_append": false, 00:26:28.948 "compare": false, 00:26:28.948 "compare_and_write": false, 00:26:28.948 "abort": false, 00:26:28.948 "seek_hole": false, 00:26:28.948 "seek_data": false, 00:26:28.948 "copy": false, 00:26:28.948 "nvme_iov_md": false 00:26:28.948 }, 00:26:28.948 "memory_domains": [ 00:26:28.948 { 00:26:28.948 "dma_device_id": "system", 00:26:28.948 "dma_device_type": 1 00:26:28.948 }, 00:26:28.948 { 00:26:28.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:28.948 "dma_device_type": 2 00:26:28.948 }, 00:26:28.948 { 00:26:28.948 "dma_device_id": "system", 00:26:28.948 "dma_device_type": 1 00:26:28.948 }, 00:26:28.948 { 00:26:28.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:28.948 "dma_device_type": 2 00:26:28.948 } 00:26:28.948 ], 00:26:28.948 "driver_specific": { 00:26:28.948 "raid": { 00:26:28.948 "uuid": "3fdc4e56-2d17-411b-a79e-4919caf6fd9e", 00:26:28.948 "strip_size_kb": 0, 00:26:28.948 "state": "online", 00:26:28.948 "raid_level": "raid1", 00:26:28.948 "superblock": true, 00:26:28.948 "num_base_bdevs": 2, 00:26:28.948 "num_base_bdevs_discovered": 2, 00:26:28.948 "num_base_bdevs_operational": 2, 00:26:28.948 "base_bdevs_list": [ 00:26:28.948 { 00:26:28.948 "name": "BaseBdev1", 00:26:28.948 "uuid": "8630a970-1e09-453f-8a90-9a95d223eed1", 00:26:28.948 "is_configured": true, 00:26:28.948 "data_offset": 256, 00:26:28.948 "data_size": 7936 00:26:28.948 }, 00:26:28.948 { 00:26:28.948 "name": "BaseBdev2", 00:26:28.948 "uuid": "a5b39ed6-6df6-41af-800f-c42b434843ae", 00:26:28.948 "is_configured": true, 00:26:28.948 "data_offset": 256, 00:26:28.948 "data_size": 7936 00:26:28.948 } 00:26:28.948 ] 00:26:28.948 } 00:26:28.948 } 00:26:28.948 }' 00:26:28.948 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:28.948 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:28.948 BaseBdev2' 
00:26:28.948 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:28.948 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:28.948 09:30:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:29.207 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:29.207 "name": "BaseBdev1", 00:26:29.207 "aliases": [ 00:26:29.207 "8630a970-1e09-453f-8a90-9a95d223eed1" 00:26:29.207 ], 00:26:29.207 "product_name": "Malloc disk", 00:26:29.207 "block_size": 4096, 00:26:29.207 "num_blocks": 8192, 00:26:29.207 "uuid": "8630a970-1e09-453f-8a90-9a95d223eed1", 00:26:29.207 "assigned_rate_limits": { 00:26:29.207 "rw_ios_per_sec": 0, 00:26:29.207 "rw_mbytes_per_sec": 0, 00:26:29.207 "r_mbytes_per_sec": 0, 00:26:29.207 "w_mbytes_per_sec": 0 00:26:29.207 }, 00:26:29.207 "claimed": true, 00:26:29.207 "claim_type": "exclusive_write", 00:26:29.207 "zoned": false, 00:26:29.207 "supported_io_types": { 00:26:29.207 "read": true, 00:26:29.207 "write": true, 00:26:29.207 "unmap": true, 00:26:29.207 "flush": true, 00:26:29.207 "reset": true, 00:26:29.207 "nvme_admin": false, 00:26:29.207 "nvme_io": false, 00:26:29.207 "nvme_io_md": false, 00:26:29.207 "write_zeroes": true, 00:26:29.207 "zcopy": true, 00:26:29.207 "get_zone_info": false, 00:26:29.207 "zone_management": false, 00:26:29.207 "zone_append": false, 00:26:29.207 "compare": false, 00:26:29.207 "compare_and_write": false, 00:26:29.207 "abort": true, 00:26:29.207 "seek_hole": false, 00:26:29.207 "seek_data": false, 00:26:29.207 "copy": true, 00:26:29.207 "nvme_iov_md": false 00:26:29.207 }, 00:26:29.207 "memory_domains": [ 00:26:29.207 { 00:26:29.207 "dma_device_id": "system", 00:26:29.207 "dma_device_type": 1 00:26:29.207 }, 00:26:29.207 { 00:26:29.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:29.207 "dma_device_type": 2 00:26:29.207 } 00:26:29.207 ], 00:26:29.207 "driver_specific": {} 00:26:29.207 }' 00:26:29.207 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:29.207 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:29.465 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:29.465 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:29.465 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:29.465 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:29.465 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:29.465 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:29.465 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:29.465 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:29.465 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:29.724 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:29.724 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:29.724 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:29.724 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:29.724 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:29.724 "name": "BaseBdev2", 00:26:29.724 "aliases": [ 00:26:29.724 "a5b39ed6-6df6-41af-800f-c42b434843ae" 00:26:29.724 ], 00:26:29.724 "product_name": "Malloc disk", 00:26:29.724 "block_size": 4096, 00:26:29.724 "num_blocks": 8192, 00:26:29.724 "uuid": "a5b39ed6-6df6-41af-800f-c42b434843ae", 00:26:29.724 "assigned_rate_limits": { 00:26:29.724 "rw_ios_per_sec": 0, 00:26:29.724 "rw_mbytes_per_sec": 0, 00:26:29.724 "r_mbytes_per_sec": 0, 00:26:29.724 "w_mbytes_per_sec": 0 00:26:29.724 }, 00:26:29.724 "claimed": true, 00:26:29.724 "claim_type": "exclusive_write", 00:26:29.724 "zoned": false, 00:26:29.724 "supported_io_types": { 00:26:29.724 "read": true, 00:26:29.724 "write": true, 00:26:29.724 "unmap": true, 00:26:29.724 "flush": true, 00:26:29.724 "reset": true, 00:26:29.724 "nvme_admin": false, 00:26:29.724 "nvme_io": false, 00:26:29.724 "nvme_io_md": false, 00:26:29.724 "write_zeroes": true, 00:26:29.724 "zcopy": true, 00:26:29.724 "get_zone_info": false, 00:26:29.724 "zone_management": false, 00:26:29.724 "zone_append": false, 00:26:29.724 "compare": false, 00:26:29.724 "compare_and_write": false, 00:26:29.724 "abort": true, 00:26:29.724 "seek_hole": false, 00:26:29.724 "seek_data": false, 00:26:29.724 "copy": true, 00:26:29.724 "nvme_iov_md": false 00:26:29.724 }, 00:26:29.724 "memory_domains": [ 00:26:29.724 { 00:26:29.724 "dma_device_id": "system", 00:26:29.724 "dma_device_type": 1 00:26:29.724 }, 00:26:29.724 { 00:26:29.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:29.724 "dma_device_type": 2 00:26:29.724 } 00:26:29.724 ], 00:26:29.724 "driver_specific": {} 00:26:29.724 }' 00:26:29.724 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:29.724 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:29.983 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:29.983 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:29.983 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:29.983 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:29.983 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:29.983 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:29.983 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:29.983 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:29.983 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:30.242 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:30.242 09:30:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:30.242 [2024-07-15 09:30:39.179406] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.501 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:30.760 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:30.760 "name": "Existed_Raid", 00:26:30.760 "uuid": "3fdc4e56-2d17-411b-a79e-4919caf6fd9e", 00:26:30.760 "strip_size_kb": 0, 00:26:30.760 "state": "online", 00:26:30.760 "raid_level": "raid1", 00:26:30.760 "superblock": true, 00:26:30.760 "num_base_bdevs": 2, 00:26:30.760 "num_base_bdevs_discovered": 1, 00:26:30.760 "num_base_bdevs_operational": 1, 00:26:30.760 "base_bdevs_list": [ 00:26:30.760 { 00:26:30.760 "name": null, 00:26:30.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.760 "is_configured": false, 00:26:30.760 "data_offset": 256, 00:26:30.760 "data_size": 7936 00:26:30.760 }, 00:26:30.760 { 00:26:30.760 "name": "BaseBdev2", 00:26:30.760 "uuid": "a5b39ed6-6df6-41af-800f-c42b434843ae", 00:26:30.760 "is_configured": true, 00:26:30.760 "data_offset": 256, 00:26:30.760 "data_size": 7936 00:26:30.760 } 00:26:30.760 ] 00:26:30.760 }' 00:26:30.760 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:30.760 09:30:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:31.328 09:30:40 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:31.328 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:31.328 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.328 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:31.588 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:31.588 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:31.588 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:31.588 [2024-07-15 09:30:40.512235] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:31.588 [2024-07-15 09:30:40.512321] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:31.588 [2024-07-15 09:30:40.523153] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:31.588 [2024-07-15 09:30:40.523184] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:31.588 [2024-07-15 09:30:40.523195] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1115000 name Existed_Raid, state offline 00:26:31.849 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:31.849 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:31.849 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.849 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 223185 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 223185 ']' 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 223185 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 223185 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 223185' 00:26:32.108 killing process with pid 223185 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 223185 00:26:32.108 [2024-07-15 09:30:40.853528] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:32.108 09:30:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 223185 00:26:32.108 [2024-07-15 09:30:40.854423] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:32.367 09:30:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:26:32.367 00:26:32.367 real 0m10.803s 00:26:32.367 user 0m19.203s 00:26:32.367 sys 0m2.036s 00:26:32.367 09:30:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:32.367 09:30:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:32.367 ************************************ 00:26:32.367 END TEST raid_state_function_test_sb_4k 00:26:32.367 ************************************ 00:26:32.367 09:30:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:32.367 09:30:41 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:26:32.367 09:30:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:26:32.367 09:30:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:32.367 09:30:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:32.367 ************************************ 00:26:32.367 START TEST raid_superblock_test_4k 00:26:32.367 ************************************ 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- 
# raid_pid=224793 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 224793 /var/tmp/spdk-raid.sock 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 224793 ']' 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:32.367 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:32.367 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:32.367 [2024-07-15 09:30:41.209813] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:26:32.368 [2024-07-15 09:30:41.209877] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid224793 ] 00:26:32.627 [2024-07-15 09:30:41.336451] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:32.627 [2024-07-15 09:30:41.444727] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:32.627 [2024-07-15 09:30:41.507332] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:32.627 [2024-07-15 09:30:41.507367] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:32.886 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:26:33.146 malloc1 00:26:33.146 09:30:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:33.715 [2024-07-15 09:30:42.397166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:33.715 [2024-07-15 09:30:42.397214] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.715 [2024-07-15 09:30:42.397234] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb58570 00:26:33.715 [2024-07-15 09:30:42.397247] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.715 [2024-07-15 09:30:42.398969] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.715 [2024-07-15 09:30:42.399001] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:33.715 pt1 00:26:33.715 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:33.715 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:33.715 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:26:33.715 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:26:33.715 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:33.715 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:33.715 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:33.715 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:33.715 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:26:33.715 malloc2 00:26:33.974 09:30:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:34.234 [2024-07-15 09:30:43.147897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:34.234 [2024-07-15 09:30:43.147948] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:34.234 [2024-07-15 09:30:43.147968] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb59970 00:26:34.234 [2024-07-15 09:30:43.147981] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:34.234 [2024-07-15 09:30:43.149606] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:34.234 [2024-07-15 09:30:43.149635] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:34.234 pt2 00:26:34.234 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:34.234 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:34.234 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:26:34.492 [2024-07-15 09:30:43.392589] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:34.492 [2024-07-15 09:30:43.393974] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:34.492 [2024-07-15 09:30:43.394130] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcfc270 00:26:34.492 [2024-07-15 09:30:43.394144] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:34.492 [2024-07-15 09:30:43.394352] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb500e0 00:26:34.492 [2024-07-15 09:30:43.394503] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcfc270 00:26:34.492 [2024-07-15 09:30:43.394513] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcfc270 00:26:34.492 [2024-07-15 09:30:43.394617] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.492 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.059 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:35.059 "name": "raid_bdev1", 00:26:35.059 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:35.059 "strip_size_kb": 0, 00:26:35.059 "state": "online", 00:26:35.059 "raid_level": "raid1", 00:26:35.059 "superblock": true, 00:26:35.059 "num_base_bdevs": 2, 00:26:35.059 "num_base_bdevs_discovered": 2, 00:26:35.059 "num_base_bdevs_operational": 2, 00:26:35.059 "base_bdevs_list": [ 00:26:35.059 { 00:26:35.059 "name": "pt1", 00:26:35.059 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:35.059 "is_configured": true, 00:26:35.059 "data_offset": 256, 00:26:35.059 "data_size": 7936 00:26:35.059 }, 00:26:35.059 { 00:26:35.059 "name": "pt2", 00:26:35.059 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:35.059 "is_configured": true, 00:26:35.059 "data_offset": 256, 00:26:35.059 "data_size": 7936 00:26:35.059 } 00:26:35.059 ] 00:26:35.059 }' 00:26:35.059 09:30:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:35.059 09:30:43 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@10 -- # set +x 00:26:35.626 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:26:35.626 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:35.626 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:35.626 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:35.626 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:35.626 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:35.626 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:35.626 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:35.886 [2024-07-15 09:30:44.760411] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:35.886 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:35.886 "name": "raid_bdev1", 00:26:35.886 "aliases": [ 00:26:35.886 "b2087c21-b326-48e4-98f0-4123cae9d9ce" 00:26:35.886 ], 00:26:35.886 "product_name": "Raid Volume", 00:26:35.886 "block_size": 4096, 00:26:35.886 "num_blocks": 7936, 00:26:35.886 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:35.886 "assigned_rate_limits": { 00:26:35.886 "rw_ios_per_sec": 0, 00:26:35.886 "rw_mbytes_per_sec": 0, 00:26:35.886 "r_mbytes_per_sec": 0, 00:26:35.886 "w_mbytes_per_sec": 0 00:26:35.886 }, 00:26:35.886 "claimed": false, 00:26:35.886 "zoned": false, 00:26:35.886 "supported_io_types": { 00:26:35.886 "read": true, 00:26:35.886 "write": true, 00:26:35.886 "unmap": false, 00:26:35.886 "flush": false, 00:26:35.886 "reset": true, 00:26:35.886 "nvme_admin": false, 00:26:35.886 "nvme_io": false, 00:26:35.886 "nvme_io_md": false, 00:26:35.886 "write_zeroes": true, 00:26:35.886 "zcopy": false, 00:26:35.886 "get_zone_info": false, 00:26:35.886 "zone_management": false, 00:26:35.886 "zone_append": false, 00:26:35.886 "compare": false, 00:26:35.886 "compare_and_write": false, 00:26:35.886 "abort": false, 00:26:35.886 "seek_hole": false, 00:26:35.886 "seek_data": false, 00:26:35.886 "copy": false, 00:26:35.886 "nvme_iov_md": false 00:26:35.886 }, 00:26:35.886 "memory_domains": [ 00:26:35.886 { 00:26:35.886 "dma_device_id": "system", 00:26:35.886 "dma_device_type": 1 00:26:35.886 }, 00:26:35.886 { 00:26:35.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:35.886 "dma_device_type": 2 00:26:35.886 }, 00:26:35.886 { 00:26:35.886 "dma_device_id": "system", 00:26:35.886 "dma_device_type": 1 00:26:35.886 }, 00:26:35.886 { 00:26:35.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:35.886 "dma_device_type": 2 00:26:35.886 } 00:26:35.886 ], 00:26:35.886 "driver_specific": { 00:26:35.886 "raid": { 00:26:35.886 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:35.886 "strip_size_kb": 0, 00:26:35.886 "state": "online", 00:26:35.886 "raid_level": "raid1", 00:26:35.886 "superblock": true, 00:26:35.886 "num_base_bdevs": 2, 00:26:35.886 "num_base_bdevs_discovered": 2, 00:26:35.886 "num_base_bdevs_operational": 2, 00:26:35.886 "base_bdevs_list": [ 00:26:35.886 { 00:26:35.886 "name": "pt1", 00:26:35.886 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:35.886 "is_configured": true, 00:26:35.886 
"data_offset": 256, 00:26:35.886 "data_size": 7936 00:26:35.886 }, 00:26:35.886 { 00:26:35.886 "name": "pt2", 00:26:35.886 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:35.886 "is_configured": true, 00:26:35.886 "data_offset": 256, 00:26:35.886 "data_size": 7936 00:26:35.886 } 00:26:35.886 ] 00:26:35.886 } 00:26:35.886 } 00:26:35.886 }' 00:26:35.886 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:35.886 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:35.886 pt2' 00:26:35.886 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:35.886 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:35.886 09:30:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:36.145 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:36.145 "name": "pt1", 00:26:36.145 "aliases": [ 00:26:36.145 "00000000-0000-0000-0000-000000000001" 00:26:36.145 ], 00:26:36.145 "product_name": "passthru", 00:26:36.145 "block_size": 4096, 00:26:36.145 "num_blocks": 8192, 00:26:36.145 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:36.145 "assigned_rate_limits": { 00:26:36.145 "rw_ios_per_sec": 0, 00:26:36.145 "rw_mbytes_per_sec": 0, 00:26:36.145 "r_mbytes_per_sec": 0, 00:26:36.145 "w_mbytes_per_sec": 0 00:26:36.145 }, 00:26:36.145 "claimed": true, 00:26:36.145 "claim_type": "exclusive_write", 00:26:36.145 "zoned": false, 00:26:36.145 "supported_io_types": { 00:26:36.145 "read": true, 00:26:36.145 "write": true, 00:26:36.145 "unmap": true, 00:26:36.145 "flush": true, 00:26:36.145 "reset": true, 00:26:36.145 "nvme_admin": false, 00:26:36.145 "nvme_io": false, 00:26:36.145 "nvme_io_md": false, 00:26:36.145 "write_zeroes": true, 00:26:36.145 "zcopy": true, 00:26:36.145 "get_zone_info": false, 00:26:36.145 "zone_management": false, 00:26:36.145 "zone_append": false, 00:26:36.145 "compare": false, 00:26:36.145 "compare_and_write": false, 00:26:36.145 "abort": true, 00:26:36.145 "seek_hole": false, 00:26:36.145 "seek_data": false, 00:26:36.145 "copy": true, 00:26:36.145 "nvme_iov_md": false 00:26:36.145 }, 00:26:36.145 "memory_domains": [ 00:26:36.145 { 00:26:36.145 "dma_device_id": "system", 00:26:36.145 "dma_device_type": 1 00:26:36.145 }, 00:26:36.145 { 00:26:36.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:36.145 "dma_device_type": 2 00:26:36.145 } 00:26:36.145 ], 00:26:36.145 "driver_specific": { 00:26:36.145 "passthru": { 00:26:36.145 "name": "pt1", 00:26:36.145 "base_bdev_name": "malloc1" 00:26:36.145 } 00:26:36.145 } 00:26:36.145 }' 00:26:36.145 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:36.404 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:36.404 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:36.404 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:36.404 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:36.404 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:36.404 09:30:45 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:36.404 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:36.404 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:36.663 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:36.663 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:36.663 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:36.663 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:36.663 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:36.663 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:36.921 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:36.921 "name": "pt2", 00:26:36.921 "aliases": [ 00:26:36.921 "00000000-0000-0000-0000-000000000002" 00:26:36.921 ], 00:26:36.921 "product_name": "passthru", 00:26:36.921 "block_size": 4096, 00:26:36.921 "num_blocks": 8192, 00:26:36.921 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:36.921 "assigned_rate_limits": { 00:26:36.921 "rw_ios_per_sec": 0, 00:26:36.921 "rw_mbytes_per_sec": 0, 00:26:36.921 "r_mbytes_per_sec": 0, 00:26:36.921 "w_mbytes_per_sec": 0 00:26:36.921 }, 00:26:36.921 "claimed": true, 00:26:36.921 "claim_type": "exclusive_write", 00:26:36.921 "zoned": false, 00:26:36.921 "supported_io_types": { 00:26:36.921 "read": true, 00:26:36.921 "write": true, 00:26:36.921 "unmap": true, 00:26:36.921 "flush": true, 00:26:36.921 "reset": true, 00:26:36.921 "nvme_admin": false, 00:26:36.921 "nvme_io": false, 00:26:36.921 "nvme_io_md": false, 00:26:36.921 "write_zeroes": true, 00:26:36.921 "zcopy": true, 00:26:36.921 "get_zone_info": false, 00:26:36.921 "zone_management": false, 00:26:36.921 "zone_append": false, 00:26:36.921 "compare": false, 00:26:36.921 "compare_and_write": false, 00:26:36.921 "abort": true, 00:26:36.921 "seek_hole": false, 00:26:36.921 "seek_data": false, 00:26:36.921 "copy": true, 00:26:36.921 "nvme_iov_md": false 00:26:36.921 }, 00:26:36.921 "memory_domains": [ 00:26:36.921 { 00:26:36.921 "dma_device_id": "system", 00:26:36.921 "dma_device_type": 1 00:26:36.921 }, 00:26:36.921 { 00:26:36.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:36.922 "dma_device_type": 2 00:26:36.922 } 00:26:36.922 ], 00:26:36.922 "driver_specific": { 00:26:36.922 "passthru": { 00:26:36.922 "name": "pt2", 00:26:36.922 "base_bdev_name": "malloc2" 00:26:36.922 } 00:26:36.922 } 00:26:36.922 }' 00:26:36.922 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:36.922 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:36.922 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:36.922 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:36.922 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:37.180 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:37.180 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:37.180 09:30:45 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:37.180 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:37.180 09:30:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:37.180 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:37.180 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:37.180 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:37.180 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:37.439 [2024-07-15 09:30:46.280412] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:37.439 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b2087c21-b326-48e4-98f0-4123cae9d9ce 00:26:37.439 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z b2087c21-b326-48e4-98f0-4123cae9d9ce ']' 00:26:37.439 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:37.698 [2024-07-15 09:30:46.512797] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:37.698 [2024-07-15 09:30:46.512816] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:37.698 [2024-07-15 09:30:46.512868] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:37.698 [2024-07-15 09:30:46.512934] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:37.698 [2024-07-15 09:30:46.512947] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcfc270 name raid_bdev1, state offline 00:26:37.698 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.698 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:37.957 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:37.957 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:37.957 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:37.957 09:30:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:38.216 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:38.216 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:38.476 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:38.476 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:38.734 09:30:47 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:38.734 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:38.992 [2024-07-15 09:30:47.731975] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:38.992 [2024-07-15 09:30:47.733363] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:38.992 [2024-07-15 09:30:47.733423] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:38.992 [2024-07-15 09:30:47.733463] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:38.992 [2024-07-15 09:30:47.733482] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:38.992 [2024-07-15 09:30:47.733492] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcfbff0 name raid_bdev1, state configuring 00:26:38.992 request: 00:26:38.992 { 00:26:38.992 "name": "raid_bdev1", 00:26:38.992 "raid_level": "raid1", 00:26:38.992 "base_bdevs": [ 00:26:38.992 "malloc1", 00:26:38.992 "malloc2" 00:26:38.992 ], 00:26:38.992 "superblock": false, 00:26:38.992 "method": "bdev_raid_create", 00:26:38.992 "req_id": 1 00:26:38.992 } 00:26:38.992 Got JSON-RPC error response 00:26:38.992 response: 00:26:38.992 { 00:26:38.992 "code": -17, 00:26:38.992 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:38.992 } 00:26:38.992 09:30:47 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@651 -- # es=1 00:26:38.992 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:38.992 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:38.992 09:30:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:38.992 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.992 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:39.250 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:39.250 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:39.250 09:30:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:39.509 [2024-07-15 09:30:48.225232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:39.509 [2024-07-15 09:30:48.225275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:39.509 [2024-07-15 09:30:48.225296] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb587a0 00:26:39.509 [2024-07-15 09:30:48.225308] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:39.509 [2024-07-15 09:30:48.226937] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:39.509 [2024-07-15 09:30:48.226965] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:39.509 [2024-07-15 09:30:48.227030] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:39.509 [2024-07-15 09:30:48.227057] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:39.509 pt1 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.509 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:26:39.767 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:39.767 "name": "raid_bdev1", 00:26:39.767 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:39.767 "strip_size_kb": 0, 00:26:39.767 "state": "configuring", 00:26:39.767 "raid_level": "raid1", 00:26:39.767 "superblock": true, 00:26:39.767 "num_base_bdevs": 2, 00:26:39.767 "num_base_bdevs_discovered": 1, 00:26:39.767 "num_base_bdevs_operational": 2, 00:26:39.767 "base_bdevs_list": [ 00:26:39.767 { 00:26:39.767 "name": "pt1", 00:26:39.767 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:39.767 "is_configured": true, 00:26:39.767 "data_offset": 256, 00:26:39.767 "data_size": 7936 00:26:39.767 }, 00:26:39.767 { 00:26:39.767 "name": null, 00:26:39.767 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:39.767 "is_configured": false, 00:26:39.767 "data_offset": 256, 00:26:39.767 "data_size": 7936 00:26:39.767 } 00:26:39.767 ] 00:26:39.767 }' 00:26:39.768 09:30:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:39.768 09:30:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:40.335 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:26:40.335 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:40.335 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:40.335 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:40.594 [2024-07-15 09:30:49.324152] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:40.594 [2024-07-15 09:30:49.324199] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:40.594 [2024-07-15 09:30:49.324222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcf06f0 00:26:40.594 [2024-07-15 09:30:49.324235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:40.594 [2024-07-15 09:30:49.324578] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:40.594 [2024-07-15 09:30:49.324598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:40.594 [2024-07-15 09:30:49.324661] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:40.594 [2024-07-15 09:30:49.324680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:40.594 [2024-07-15 09:30:49.324781] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcf1590 00:26:40.594 [2024-07-15 09:30:49.324792] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:40.594 [2024-07-15 09:30:49.324965] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb52540 00:26:40.594 [2024-07-15 09:30:49.325090] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcf1590 00:26:40.594 [2024-07-15 09:30:49.325101] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcf1590 00:26:40.594 [2024-07-15 09:30:49.325195] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:40.594 pt2 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.594 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.852 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:40.852 "name": "raid_bdev1", 00:26:40.852 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:40.852 "strip_size_kb": 0, 00:26:40.852 "state": "online", 00:26:40.852 "raid_level": "raid1", 00:26:40.852 "superblock": true, 00:26:40.852 "num_base_bdevs": 2, 00:26:40.852 "num_base_bdevs_discovered": 2, 00:26:40.852 "num_base_bdevs_operational": 2, 00:26:40.852 "base_bdevs_list": [ 00:26:40.852 { 00:26:40.852 "name": "pt1", 00:26:40.852 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:40.852 "is_configured": true, 00:26:40.852 "data_offset": 256, 00:26:40.852 "data_size": 7936 00:26:40.852 }, 00:26:40.852 { 00:26:40.852 "name": "pt2", 00:26:40.852 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:40.852 "is_configured": true, 00:26:40.852 "data_offset": 256, 00:26:40.852 "data_size": 7936 00:26:40.852 } 00:26:40.852 ] 00:26:40.852 }' 00:26:40.852 09:30:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:40.852 09:30:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:41.419 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:26:41.419 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:41.419 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:41.419 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:41.419 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:41.419 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:41.419 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:41.419 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:41.678 [2024-07-15 09:30:50.415291] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:41.678 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:41.678 "name": "raid_bdev1", 00:26:41.678 "aliases": [ 00:26:41.678 "b2087c21-b326-48e4-98f0-4123cae9d9ce" 00:26:41.678 ], 00:26:41.678 "product_name": "Raid Volume", 00:26:41.678 "block_size": 4096, 00:26:41.678 "num_blocks": 7936, 00:26:41.678 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:41.678 "assigned_rate_limits": { 00:26:41.678 "rw_ios_per_sec": 0, 00:26:41.678 "rw_mbytes_per_sec": 0, 00:26:41.678 "r_mbytes_per_sec": 0, 00:26:41.678 "w_mbytes_per_sec": 0 00:26:41.678 }, 00:26:41.678 "claimed": false, 00:26:41.678 "zoned": false, 00:26:41.678 "supported_io_types": { 00:26:41.678 "read": true, 00:26:41.678 "write": true, 00:26:41.678 "unmap": false, 00:26:41.678 "flush": false, 00:26:41.678 "reset": true, 00:26:41.678 "nvme_admin": false, 00:26:41.678 "nvme_io": false, 00:26:41.678 "nvme_io_md": false, 00:26:41.678 "write_zeroes": true, 00:26:41.678 "zcopy": false, 00:26:41.678 "get_zone_info": false, 00:26:41.678 "zone_management": false, 00:26:41.678 "zone_append": false, 00:26:41.678 "compare": false, 00:26:41.678 "compare_and_write": false, 00:26:41.678 "abort": false, 00:26:41.678 "seek_hole": false, 00:26:41.678 "seek_data": false, 00:26:41.678 "copy": false, 00:26:41.678 "nvme_iov_md": false 00:26:41.678 }, 00:26:41.678 "memory_domains": [ 00:26:41.678 { 00:26:41.678 "dma_device_id": "system", 00:26:41.678 "dma_device_type": 1 00:26:41.678 }, 00:26:41.678 { 00:26:41.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:41.678 "dma_device_type": 2 00:26:41.678 }, 00:26:41.678 { 00:26:41.678 "dma_device_id": "system", 00:26:41.678 "dma_device_type": 1 00:26:41.678 }, 00:26:41.678 { 00:26:41.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:41.678 "dma_device_type": 2 00:26:41.678 } 00:26:41.678 ], 00:26:41.678 "driver_specific": { 00:26:41.678 "raid": { 00:26:41.678 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:41.678 "strip_size_kb": 0, 00:26:41.678 "state": "online", 00:26:41.678 "raid_level": "raid1", 00:26:41.678 "superblock": true, 00:26:41.678 "num_base_bdevs": 2, 00:26:41.678 "num_base_bdevs_discovered": 2, 00:26:41.678 "num_base_bdevs_operational": 2, 00:26:41.678 "base_bdevs_list": [ 00:26:41.678 { 00:26:41.678 "name": "pt1", 00:26:41.678 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:41.678 "is_configured": true, 00:26:41.678 "data_offset": 256, 00:26:41.678 "data_size": 7936 00:26:41.678 }, 00:26:41.678 { 00:26:41.678 "name": "pt2", 00:26:41.678 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:41.678 "is_configured": true, 00:26:41.678 "data_offset": 256, 00:26:41.678 "data_size": 7936 00:26:41.678 } 00:26:41.678 ] 00:26:41.678 } 00:26:41.678 } 00:26:41.678 }' 00:26:41.679 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:41.679 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:41.679 pt2' 00:26:41.679 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:41.679 09:30:50 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:41.679 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:41.953 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:41.953 "name": "pt1", 00:26:41.953 "aliases": [ 00:26:41.953 "00000000-0000-0000-0000-000000000001" 00:26:41.953 ], 00:26:41.953 "product_name": "passthru", 00:26:41.953 "block_size": 4096, 00:26:41.953 "num_blocks": 8192, 00:26:41.953 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:41.953 "assigned_rate_limits": { 00:26:41.953 "rw_ios_per_sec": 0, 00:26:41.953 "rw_mbytes_per_sec": 0, 00:26:41.953 "r_mbytes_per_sec": 0, 00:26:41.953 "w_mbytes_per_sec": 0 00:26:41.953 }, 00:26:41.953 "claimed": true, 00:26:41.953 "claim_type": "exclusive_write", 00:26:41.953 "zoned": false, 00:26:41.953 "supported_io_types": { 00:26:41.953 "read": true, 00:26:41.953 "write": true, 00:26:41.953 "unmap": true, 00:26:41.953 "flush": true, 00:26:41.953 "reset": true, 00:26:41.953 "nvme_admin": false, 00:26:41.953 "nvme_io": false, 00:26:41.953 "nvme_io_md": false, 00:26:41.953 "write_zeroes": true, 00:26:41.953 "zcopy": true, 00:26:41.953 "get_zone_info": false, 00:26:41.953 "zone_management": false, 00:26:41.953 "zone_append": false, 00:26:41.953 "compare": false, 00:26:41.953 "compare_and_write": false, 00:26:41.953 "abort": true, 00:26:41.953 "seek_hole": false, 00:26:41.953 "seek_data": false, 00:26:41.953 "copy": true, 00:26:41.953 "nvme_iov_md": false 00:26:41.953 }, 00:26:41.953 "memory_domains": [ 00:26:41.953 { 00:26:41.953 "dma_device_id": "system", 00:26:41.953 "dma_device_type": 1 00:26:41.953 }, 00:26:41.953 { 00:26:41.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:41.953 "dma_device_type": 2 00:26:41.953 } 00:26:41.953 ], 00:26:41.953 "driver_specific": { 00:26:41.953 "passthru": { 00:26:41.953 "name": "pt1", 00:26:41.953 "base_bdev_name": "malloc1" 00:26:41.953 } 00:26:41.953 } 00:26:41.953 }' 00:26:41.953 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:41.953 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:41.954 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:41.954 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:41.954 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:41.954 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:41.954 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:42.259 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:42.259 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:42.259 09:30:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:42.259 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:42.259 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:42.259 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:42.259 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:42.259 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:42.517 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:42.517 "name": "pt2", 00:26:42.517 "aliases": [ 00:26:42.517 "00000000-0000-0000-0000-000000000002" 00:26:42.517 ], 00:26:42.517 "product_name": "passthru", 00:26:42.517 "block_size": 4096, 00:26:42.517 "num_blocks": 8192, 00:26:42.517 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:42.517 "assigned_rate_limits": { 00:26:42.517 "rw_ios_per_sec": 0, 00:26:42.517 "rw_mbytes_per_sec": 0, 00:26:42.517 "r_mbytes_per_sec": 0, 00:26:42.517 "w_mbytes_per_sec": 0 00:26:42.517 }, 00:26:42.517 "claimed": true, 00:26:42.517 "claim_type": "exclusive_write", 00:26:42.517 "zoned": false, 00:26:42.517 "supported_io_types": { 00:26:42.517 "read": true, 00:26:42.517 "write": true, 00:26:42.517 "unmap": true, 00:26:42.517 "flush": true, 00:26:42.517 "reset": true, 00:26:42.517 "nvme_admin": false, 00:26:42.517 "nvme_io": false, 00:26:42.517 "nvme_io_md": false, 00:26:42.517 "write_zeroes": true, 00:26:42.517 "zcopy": true, 00:26:42.517 "get_zone_info": false, 00:26:42.517 "zone_management": false, 00:26:42.517 "zone_append": false, 00:26:42.517 "compare": false, 00:26:42.517 "compare_and_write": false, 00:26:42.517 "abort": true, 00:26:42.517 "seek_hole": false, 00:26:42.517 "seek_data": false, 00:26:42.517 "copy": true, 00:26:42.517 "nvme_iov_md": false 00:26:42.517 }, 00:26:42.517 "memory_domains": [ 00:26:42.517 { 00:26:42.517 "dma_device_id": "system", 00:26:42.517 "dma_device_type": 1 00:26:42.517 }, 00:26:42.517 { 00:26:42.517 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:42.517 "dma_device_type": 2 00:26:42.517 } 00:26:42.517 ], 00:26:42.517 "driver_specific": { 00:26:42.517 "passthru": { 00:26:42.517 "name": "pt2", 00:26:42.517 "base_bdev_name": "malloc2" 00:26:42.517 } 00:26:42.517 } 00:26:42.517 }' 00:26:42.517 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:42.517 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:42.517 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:42.518 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:42.518 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:42.775 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:42.775 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:42.775 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:42.775 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:42.775 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:42.775 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:42.775 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:42.775 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:42.775 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 
-- # jq -r '.[] | .uuid' 00:26:43.033 [2024-07-15 09:30:51.907232] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:43.033 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' b2087c21-b326-48e4-98f0-4123cae9d9ce '!=' b2087c21-b326-48e4-98f0-4123cae9d9ce ']' 00:26:43.033 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:26:43.033 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:43.033 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:43.033 09:30:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:43.293 [2024-07-15 09:30:52.155685] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.293 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:43.552 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:43.552 "name": "raid_bdev1", 00:26:43.552 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:43.552 "strip_size_kb": 0, 00:26:43.552 "state": "online", 00:26:43.552 "raid_level": "raid1", 00:26:43.552 "superblock": true, 00:26:43.552 "num_base_bdevs": 2, 00:26:43.552 "num_base_bdevs_discovered": 1, 00:26:43.552 "num_base_bdevs_operational": 1, 00:26:43.552 "base_bdevs_list": [ 00:26:43.552 { 00:26:43.552 "name": null, 00:26:43.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:43.552 "is_configured": false, 00:26:43.552 "data_offset": 256, 00:26:43.552 "data_size": 7936 00:26:43.552 }, 00:26:43.552 { 00:26:43.552 "name": "pt2", 00:26:43.552 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:43.552 "is_configured": true, 00:26:43.552 "data_offset": 256, 00:26:43.552 "data_size": 7936 00:26:43.552 } 00:26:43.552 ] 00:26:43.552 }' 00:26:43.552 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:43.552 09:30:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 
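Stepping back from the raw trace: everything in raid_superblock_test_4k up to this point is one fixed RPC sequence against the bdev_svc app listening on /var/tmp/spdk-raid.sock. The condensed (and slightly reordered) sketch below restates the commands visible in the xtrace above; only the final jq filter is simplified for illustration.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # two 32 MB malloc bdevs with 4096-byte blocks (hence num_blocks 8192 in the dumps above)
  $rpc bdev_malloc_create 32 4096 -b malloc1
  $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc bdev_malloc_create 32 4096 -b malloc2
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  # raid1 over the two passthru bdevs; -s writes the on-disk superblock this test exercises
  $rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
  # state/property checks are plain jq over the RPC output (simplified filter shown)
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # -> online
  # re-creating over the already-claimed malloc bdevs must fail with "File exists" (-17)
  $rpc bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 || true
  # dropping one leg leaves the raid1 volume online with a single discovered base bdev
  $rpc bdev_passthru_delete pt1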
00:26:44.120 09:30:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:44.380 [2024-07-15 09:30:53.210448] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:44.380 [2024-07-15 09:30:53.210474] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:44.380 [2024-07-15 09:30:53.210524] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:44.380 [2024-07-15 09:30:53.210569] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:44.380 [2024-07-15 09:30:53.210581] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcf1590 name raid_bdev1, state offline 00:26:44.380 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.380 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:26:44.639 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:26:44.639 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:26:44.639 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:26:44.639 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:44.639 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:44.898 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:26:44.898 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:44.898 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:26:44.898 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:26:44.898 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:26:44.898 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:45.157 [2024-07-15 09:30:53.944369] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:45.157 [2024-07-15 09:30:53.944414] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:45.157 [2024-07-15 09:30:53.944431] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb59160 00:26:45.157 [2024-07-15 09:30:53.944445] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:45.157 [2024-07-15 09:30:53.946096] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:45.157 [2024-07-15 09:30:53.946127] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:45.157 [2024-07-15 09:30:53.946196] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:45.157 [2024-07-15 09:30:53.946223] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:45.157 [2024-07-15 09:30:53.946311] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb4f380 00:26:45.157 [2024-07-15 09:30:53.946322] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:45.157 [2024-07-15 09:30:53.946495] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb50a80 00:26:45.157 [2024-07-15 09:30:53.946616] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb4f380 00:26:45.157 [2024-07-15 09:30:53.946626] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb4f380 00:26:45.157 [2024-07-15 09:30:53.946720] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:45.157 pt2 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:45.157 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:45.158 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.158 09:30:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.417 09:30:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:45.417 "name": "raid_bdev1", 00:26:45.417 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:45.417 "strip_size_kb": 0, 00:26:45.417 "state": "online", 00:26:45.417 "raid_level": "raid1", 00:26:45.417 "superblock": true, 00:26:45.417 "num_base_bdevs": 2, 00:26:45.417 "num_base_bdevs_discovered": 1, 00:26:45.417 "num_base_bdevs_operational": 1, 00:26:45.417 "base_bdevs_list": [ 00:26:45.417 { 00:26:45.417 "name": null, 00:26:45.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.417 "is_configured": false, 00:26:45.417 "data_offset": 256, 00:26:45.417 "data_size": 7936 00:26:45.417 }, 00:26:45.417 { 00:26:45.417 "name": "pt2", 00:26:45.417 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:45.417 "is_configured": true, 00:26:45.417 "data_offset": 256, 00:26:45.417 "data_size": 7936 00:26:45.417 } 00:26:45.417 ] 00:26:45.417 }' 00:26:45.417 09:30:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:45.417 09:30:54 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:45.983 09:30:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:26:46.241 [2024-07-15 09:30:55.027237] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:46.241 [2024-07-15 09:30:55.027261] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:46.241 [2024-07-15 09:30:55.027313] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:46.241 [2024-07-15 09:30:55.027356] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:46.241 [2024-07-15 09:30:55.027369] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb4f380 name raid_bdev1, state offline 00:26:46.241 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:26:46.241 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.499 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:26:46.500 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:26:46.500 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:26:46.500 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:46.500 [2024-07-15 09:30:55.448345] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:46.500 [2024-07-15 09:30:55.448390] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:46.500 [2024-07-15 09:30:55.448407] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcfb520 00:26:46.500 [2024-07-15 09:30:55.448420] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:46.500 [2024-07-15 09:30:55.450006] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:46.500 [2024-07-15 09:30:55.450035] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:46.500 [2024-07-15 09:30:55.450098] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:46.500 [2024-07-15 09:30:55.450122] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:46.500 [2024-07-15 09:30:55.450215] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:46.500 [2024-07-15 09:30:55.450228] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:46.500 [2024-07-15 09:30:55.450247] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb503f0 name raid_bdev1, state configuring 00:26:46.500 [2024-07-15 09:30:55.450270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:46.500 [2024-07-15 09:30:55.450329] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb522b0 00:26:46.500 [2024-07-15 09:30:55.450340] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:46.500 [2024-07-15 09:30:55.450499] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb4f350 00:26:46.500 [2024-07-15 09:30:55.450618] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb522b0 
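The stretch above exercises superblock-driven auto-assembly: after raid_bdev1 and its passthru bdevs are torn down, re-creating a passthru bdev over a base device that still carries a raid superblock is enough for the examine path to re-create raid_bdev1 on its own, and when two superblocks disagree the higher seq_number wins (pt2 at 4 beats the stale raid bdev assembled from pt1 at 2). A minimal sketch of that flow, using the same socket and UUID as the log; the jq filter is illustrative:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_raid_delete raid_bdev1          # the on-disk superblocks survive the delete
  $rpc bdev_passthru_delete pt2
  # bringing pt2 back is enough: examine finds its raid superblock and re-creates raid_bdev1
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # -> online, 1 of 2 base bdevs discovered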
00:26:46.500 [2024-07-15 09:30:55.450628] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb522b0 00:26:46.500 [2024-07-15 09:30:55.450724] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:46.758 pt1 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:46.758 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:46.759 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.759 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.759 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:46.759 "name": "raid_bdev1", 00:26:46.759 "uuid": "b2087c21-b326-48e4-98f0-4123cae9d9ce", 00:26:46.759 "strip_size_kb": 0, 00:26:46.759 "state": "online", 00:26:46.759 "raid_level": "raid1", 00:26:46.759 "superblock": true, 00:26:46.759 "num_base_bdevs": 2, 00:26:46.759 "num_base_bdevs_discovered": 1, 00:26:46.759 "num_base_bdevs_operational": 1, 00:26:46.759 "base_bdevs_list": [ 00:26:46.759 { 00:26:46.759 "name": null, 00:26:46.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:46.759 "is_configured": false, 00:26:46.759 "data_offset": 256, 00:26:46.759 "data_size": 7936 00:26:46.759 }, 00:26:46.759 { 00:26:46.759 "name": "pt2", 00:26:46.759 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:46.759 "is_configured": true, 00:26:46.759 "data_offset": 256, 00:26:46.759 "data_size": 7936 00:26:46.759 } 00:26:46.759 ] 00:26:46.759 }' 00:26:46.759 09:30:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:46.759 09:30:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:47.325 09:30:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:47.325 09:30:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:47.584 09:30:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:26:47.584 09:30:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:47.584 09:30:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:26:47.844 [2024-07-15 09:30:56.707914] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' b2087c21-b326-48e4-98f0-4123cae9d9ce '!=' b2087c21-b326-48e4-98f0-4123cae9d9ce ']' 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 224793 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 224793 ']' 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 224793 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 224793 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 224793' 00:26:47.844 killing process with pid 224793 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 224793 00:26:47.844 [2024-07-15 09:30:56.776490] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:47.844 [2024-07-15 09:30:56.776547] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:47.844 [2024-07-15 09:30:56.776590] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:47.844 [2024-07-15 09:30:56.776602] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb522b0 name raid_bdev1, state offline 00:26:47.844 09:30:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 224793 00:26:47.844 [2024-07-15 09:30:56.795509] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:48.103 09:30:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:26:48.103 00:26:48.103 real 0m15.868s 00:26:48.103 user 0m29.140s 00:26:48.103 sys 0m2.989s 00:26:48.103 09:30:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:48.103 09:30:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:48.103 ************************************ 00:26:48.103 END TEST raid_superblock_test_4k 00:26:48.103 ************************************ 00:26:48.363 09:30:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:48.363 09:30:57 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:26:48.363 09:30:57 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:26:48.363 09:30:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:48.363 09:30:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:48.363 09:30:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:48.363 
************************************ 00:26:48.363 START TEST raid_rebuild_test_sb_4k 00:26:48.363 ************************************ 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=227065 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 227065 /var/tmp/spdk-raid.sock 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 227065 ']' 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 
00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:48.363 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:48.363 09:30:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:48.363 [2024-07-15 09:30:57.169869] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:26:48.364 [2024-07-15 09:30:57.169942] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid227065 ] 00:26:48.364 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:48.364 Zero copy mechanism will not be used. 00:26:48.364 [2024-07-15 09:30:57.296139] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:48.623 [2024-07-15 09:30:57.398725] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:48.623 [2024-07-15 09:30:57.459090] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:48.623 [2024-07-15 09:30:57.459119] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:49.190 09:30:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:49.190 09:30:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:49.190 09:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:49.190 09:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:26:49.449 BaseBdev1_malloc 00:26:49.449 09:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:49.709 [2024-07-15 09:30:58.578386] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:49.709 [2024-07-15 09:30:58.578440] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:49.709 [2024-07-15 09:30:58.578468] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f4fd40 00:26:49.709 [2024-07-15 09:30:58.578481] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:49.709 [2024-07-15 09:30:58.580256] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:49.709 [2024-07-15 09:30:58.580287] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:49.709 BaseBdev1 00:26:49.709 09:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:49.709 09:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:26:49.968 BaseBdev2_malloc 
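For readability, the base-bdev setup traced above and immediately below amounts to the following RPC sequence (a condensed sketch, not part of the captured output; socket path, sizes and bdev names are taken verbatim from the trace, with the workspace prefix of rpc.py shortened):

  # Each base device is a 32 MB malloc bdev with a 4096-byte block size,
  # wrapped in a passthru bdev that the raid1 bdev later claims.
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
  # Further down in the trace the test assembles the array with superblock support:
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1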
00:26:49.968 09:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:50.228 [2024-07-15 09:30:59.077806] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:50.228 [2024-07-15 09:30:59.077851] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:50.228 [2024-07-15 09:30:59.077879] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f50860 00:26:50.228 [2024-07-15 09:30:59.077892] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:50.228 [2024-07-15 09:30:59.079462] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:50.228 [2024-07-15 09:30:59.079492] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:50.228 BaseBdev2 00:26:50.228 09:30:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:26:50.488 spare_malloc 00:26:50.488 09:30:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:50.747 spare_delay 00:26:50.747 09:30:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:51.007 [2024-07-15 09:30:59.808346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:51.007 [2024-07-15 09:30:59.808390] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:51.007 [2024-07-15 09:30:59.808417] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20feec0 00:26:51.007 [2024-07-15 09:30:59.808429] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:51.007 [2024-07-15 09:30:59.810002] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:51.007 [2024-07-15 09:30:59.810033] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:51.007 spare 00:26:51.007 09:30:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:51.267 [2024-07-15 09:31:00.053038] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:51.267 [2024-07-15 09:31:00.054386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:51.267 [2024-07-15 09:31:00.054552] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2100070 00:26:51.267 [2024-07-15 09:31:00.054565] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:51.267 [2024-07-15 09:31:00.054768] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f9490 00:26:51.267 [2024-07-15 09:31:00.054908] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2100070 00:26:51.267 [2024-07-15 09:31:00.054918] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev is created with name raid_bdev1, raid_bdev 0x2100070 00:26:51.267 [2024-07-15 09:31:00.055033] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.267 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.526 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:51.526 "name": "raid_bdev1", 00:26:51.526 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:26:51.526 "strip_size_kb": 0, 00:26:51.526 "state": "online", 00:26:51.526 "raid_level": "raid1", 00:26:51.526 "superblock": true, 00:26:51.526 "num_base_bdevs": 2, 00:26:51.526 "num_base_bdevs_discovered": 2, 00:26:51.526 "num_base_bdevs_operational": 2, 00:26:51.526 "base_bdevs_list": [ 00:26:51.526 { 00:26:51.526 "name": "BaseBdev1", 00:26:51.526 "uuid": "45600a25-40ff-589a-8ba2-c55578c2b3ed", 00:26:51.526 "is_configured": true, 00:26:51.526 "data_offset": 256, 00:26:51.526 "data_size": 7936 00:26:51.526 }, 00:26:51.526 { 00:26:51.526 "name": "BaseBdev2", 00:26:51.526 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:26:51.526 "is_configured": true, 00:26:51.526 "data_offset": 256, 00:26:51.526 "data_size": 7936 00:26:51.526 } 00:26:51.526 ] 00:26:51.526 }' 00:26:51.526 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:51.526 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:52.093 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:52.093 09:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:52.352 [2024-07-15 09:31:01.144138] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:52.352 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:26:52.352 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:52.352 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:52.610 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:52.869 [2024-07-15 09:31:01.585096] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f9490 00:26:52.869 /dev/nbd0 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:52.869 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:52.869 1+0 records in 00:26:52.869 1+0 records out 00:26:52.870 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274492 s, 14.9 MB/s 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:52.870 09:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:26:53.806 7936+0 records in 00:26:53.806 7936+0 records out 00:26:53.806 32505856 bytes (33 MB, 31 MiB) copied, 0.747518 s, 43.5 MB/s 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:53.806 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:54.065 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:54.065 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:54.065 [2024-07-15 09:31:02.761271] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:54.065 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:54.065 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:54.065 09:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:54.065 [2024-07-15 09:31:02.989914] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:54.323 09:31:03 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.323 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:54.581 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:54.581 "name": "raid_bdev1", 00:26:54.581 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:26:54.581 "strip_size_kb": 0, 00:26:54.581 "state": "online", 00:26:54.581 "raid_level": "raid1", 00:26:54.581 "superblock": true, 00:26:54.581 "num_base_bdevs": 2, 00:26:54.581 "num_base_bdevs_discovered": 1, 00:26:54.581 "num_base_bdevs_operational": 1, 00:26:54.581 "base_bdevs_list": [ 00:26:54.581 { 00:26:54.581 "name": null, 00:26:54.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.581 "is_configured": false, 00:26:54.581 "data_offset": 256, 00:26:54.581 "data_size": 7936 00:26:54.581 }, 00:26:54.581 { 00:26:54.581 "name": "BaseBdev2", 00:26:54.581 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:26:54.581 "is_configured": true, 00:26:54.581 "data_offset": 256, 00:26:54.581 "data_size": 7936 00:26:54.581 } 00:26:54.581 ] 00:26:54.581 }' 00:26:54.581 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:54.581 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:55.148 09:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:55.148 [2024-07-15 09:31:04.088834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:55.148 [2024-07-15 09:31:04.093776] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20ffce0 00:26:55.148 [2024-07-15 09:31:04.095989] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:55.407 09:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:56.345 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:56.345 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:56.345 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:56.345 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:56.345 09:31:05 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:56.345 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.345 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.604 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:56.604 "name": "raid_bdev1", 00:26:56.604 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:26:56.604 "strip_size_kb": 0, 00:26:56.604 "state": "online", 00:26:56.604 "raid_level": "raid1", 00:26:56.604 "superblock": true, 00:26:56.604 "num_base_bdevs": 2, 00:26:56.604 "num_base_bdevs_discovered": 2, 00:26:56.604 "num_base_bdevs_operational": 2, 00:26:56.604 "process": { 00:26:56.604 "type": "rebuild", 00:26:56.604 "target": "spare", 00:26:56.604 "progress": { 00:26:56.604 "blocks": 3072, 00:26:56.604 "percent": 38 00:26:56.604 } 00:26:56.604 }, 00:26:56.604 "base_bdevs_list": [ 00:26:56.604 { 00:26:56.604 "name": "spare", 00:26:56.604 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:26:56.604 "is_configured": true, 00:26:56.604 "data_offset": 256, 00:26:56.604 "data_size": 7936 00:26:56.604 }, 00:26:56.604 { 00:26:56.604 "name": "BaseBdev2", 00:26:56.604 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:26:56.604 "is_configured": true, 00:26:56.604 "data_offset": 256, 00:26:56.604 "data_size": 7936 00:26:56.604 } 00:26:56.604 ] 00:26:56.604 }' 00:26:56.604 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:56.604 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:56.604 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:56.604 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:56.604 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:56.863 [2024-07-15 09:31:05.670720] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:56.863 [2024-07-15 09:31:05.708664] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:56.863 [2024-07-15 09:31:05.708713] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:56.863 [2024-07-15 09:31:05.708728] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:56.863 [2024-07-15 09:31:05.708737] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.863 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.122 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.122 "name": "raid_bdev1", 00:26:57.122 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:26:57.122 "strip_size_kb": 0, 00:26:57.122 "state": "online", 00:26:57.122 "raid_level": "raid1", 00:26:57.122 "superblock": true, 00:26:57.122 "num_base_bdevs": 2, 00:26:57.122 "num_base_bdevs_discovered": 1, 00:26:57.122 "num_base_bdevs_operational": 1, 00:26:57.122 "base_bdevs_list": [ 00:26:57.122 { 00:26:57.122 "name": null, 00:26:57.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.122 "is_configured": false, 00:26:57.122 "data_offset": 256, 00:26:57.122 "data_size": 7936 00:26:57.122 }, 00:26:57.122 { 00:26:57.122 "name": "BaseBdev2", 00:26:57.122 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:26:57.122 "is_configured": true, 00:26:57.122 "data_offset": 256, 00:26:57.122 "data_size": 7936 00:26:57.122 } 00:26:57.122 ] 00:26:57.122 }' 00:26:57.122 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.122 09:31:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:57.690 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:57.690 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:57.690 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:57.690 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:57.690 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:57.690 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.690 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.949 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:57.949 "name": "raid_bdev1", 00:26:57.949 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:26:57.949 "strip_size_kb": 0, 00:26:57.949 "state": "online", 00:26:57.949 "raid_level": "raid1", 00:26:57.949 "superblock": true, 00:26:57.949 "num_base_bdevs": 2, 00:26:57.949 "num_base_bdevs_discovered": 1, 00:26:57.949 "num_base_bdevs_operational": 1, 00:26:57.949 "base_bdevs_list": [ 00:26:57.949 { 00:26:57.949 "name": null, 00:26:57.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.949 "is_configured": false, 00:26:57.949 "data_offset": 
256, 00:26:57.949 "data_size": 7936 00:26:57.949 }, 00:26:57.949 { 00:26:57.949 "name": "BaseBdev2", 00:26:57.949 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:26:57.949 "is_configured": true, 00:26:57.949 "data_offset": 256, 00:26:57.949 "data_size": 7936 00:26:57.949 } 00:26:57.949 ] 00:26:57.949 }' 00:26:57.949 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:57.949 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:57.949 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:58.207 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:58.207 09:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:58.207 [2024-07-15 09:31:07.132942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:58.207 [2024-07-15 09:31:07.138525] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20ffce0 00:26:58.207 [2024-07-15 09:31:07.140044] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:58.207 09:31:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:59.625 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:59.625 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:59.625 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:59.625 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:59.625 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:59.625 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.625 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:59.625 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:59.625 "name": "raid_bdev1", 00:26:59.625 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:26:59.625 "strip_size_kb": 0, 00:26:59.625 "state": "online", 00:26:59.625 "raid_level": "raid1", 00:26:59.625 "superblock": true, 00:26:59.625 "num_base_bdevs": 2, 00:26:59.625 "num_base_bdevs_discovered": 2, 00:26:59.625 "num_base_bdevs_operational": 2, 00:26:59.625 "process": { 00:26:59.625 "type": "rebuild", 00:26:59.625 "target": "spare", 00:26:59.625 "progress": { 00:26:59.625 "blocks": 3072, 00:26:59.625 "percent": 38 00:26:59.625 } 00:26:59.625 }, 00:26:59.625 "base_bdevs_list": [ 00:26:59.625 { 00:26:59.626 "name": "spare", 00:26:59.626 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:26:59.626 "is_configured": true, 00:26:59.626 "data_offset": 256, 00:26:59.626 "data_size": 7936 00:26:59.626 }, 00:26:59.626 { 00:26:59.626 "name": "BaseBdev2", 00:26:59.626 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:26:59.626 "is_configured": true, 00:26:59.626 "data_offset": 256, 00:26:59.626 "data_size": 7936 00:26:59.626 } 00:26:59.626 ] 00:26:59.626 }' 00:26:59.626 
09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:59.626 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1012 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.626 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:59.884 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:59.884 "name": "raid_bdev1", 00:26:59.884 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:26:59.884 "strip_size_kb": 0, 00:26:59.884 "state": "online", 00:26:59.884 "raid_level": "raid1", 00:26:59.884 "superblock": true, 00:26:59.884 "num_base_bdevs": 2, 00:26:59.884 "num_base_bdevs_discovered": 2, 00:26:59.884 "num_base_bdevs_operational": 2, 00:26:59.884 "process": { 00:26:59.884 "type": "rebuild", 00:26:59.884 "target": "spare", 00:26:59.884 "progress": { 00:26:59.884 "blocks": 3840, 00:26:59.884 "percent": 48 00:26:59.884 } 00:26:59.884 }, 00:26:59.884 "base_bdevs_list": [ 00:26:59.884 { 00:26:59.884 "name": "spare", 00:26:59.884 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:26:59.884 "is_configured": true, 00:26:59.884 "data_offset": 256, 00:26:59.884 "data_size": 7936 00:26:59.884 }, 00:26:59.884 { 00:26:59.884 "name": "BaseBdev2", 00:26:59.884 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:26:59.884 "is_configured": true, 00:26:59.884 "data_offset": 256, 00:26:59.884 "data_size": 7936 00:26:59.884 } 00:26:59.884 ] 00:26:59.884 }' 00:26:59.884 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:59.884 
09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:59.884 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:59.884 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:59.884 09:31:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:01.263 09:31:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:01.263 09:31:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:01.263 09:31:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:01.263 09:31:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:01.263 09:31:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:01.263 09:31:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:01.263 09:31:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.263 09:31:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:01.263 09:31:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:01.263 "name": "raid_bdev1", 00:27:01.263 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:01.263 "strip_size_kb": 0, 00:27:01.263 "state": "online", 00:27:01.263 "raid_level": "raid1", 00:27:01.263 "superblock": true, 00:27:01.263 "num_base_bdevs": 2, 00:27:01.263 "num_base_bdevs_discovered": 2, 00:27:01.263 "num_base_bdevs_operational": 2, 00:27:01.263 "process": { 00:27:01.263 "type": "rebuild", 00:27:01.263 "target": "spare", 00:27:01.263 "progress": { 00:27:01.264 "blocks": 7424, 00:27:01.264 "percent": 93 00:27:01.264 } 00:27:01.264 }, 00:27:01.264 "base_bdevs_list": [ 00:27:01.264 { 00:27:01.264 "name": "spare", 00:27:01.264 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:27:01.264 "is_configured": true, 00:27:01.264 "data_offset": 256, 00:27:01.264 "data_size": 7936 00:27:01.264 }, 00:27:01.264 { 00:27:01.264 "name": "BaseBdev2", 00:27:01.264 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:01.264 "is_configured": true, 00:27:01.264 "data_offset": 256, 00:27:01.264 "data_size": 7936 00:27:01.264 } 00:27:01.264 ] 00:27:01.264 }' 00:27:01.264 09:31:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:01.264 09:31:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:01.264 09:31:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:01.264 09:31:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:01.264 09:31:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:01.523 [2024-07-15 09:31:10.264158] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:01.523 [2024-07-15 09:31:10.264221] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:01.523 [2024-07-15 09:31:10.264303] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:27:02.475 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:02.475 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:02.475 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:02.475 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:02.475 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:02.475 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:02.475 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:02.475 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:02.735 "name": "raid_bdev1", 00:27:02.735 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:02.735 "strip_size_kb": 0, 00:27:02.735 "state": "online", 00:27:02.735 "raid_level": "raid1", 00:27:02.735 "superblock": true, 00:27:02.735 "num_base_bdevs": 2, 00:27:02.735 "num_base_bdevs_discovered": 2, 00:27:02.735 "num_base_bdevs_operational": 2, 00:27:02.735 "base_bdevs_list": [ 00:27:02.735 { 00:27:02.735 "name": "spare", 00:27:02.735 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:27:02.735 "is_configured": true, 00:27:02.735 "data_offset": 256, 00:27:02.735 "data_size": 7936 00:27:02.735 }, 00:27:02.735 { 00:27:02.735 "name": "BaseBdev2", 00:27:02.735 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:02.735 "is_configured": true, 00:27:02.735 "data_offset": 256, 00:27:02.735 "data_size": 7936 00:27:02.735 } 00:27:02.735 ] 00:27:02.735 }' 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:02.735 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:27:02.994 "name": "raid_bdev1", 00:27:02.994 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:02.994 "strip_size_kb": 0, 00:27:02.994 "state": "online", 00:27:02.994 "raid_level": "raid1", 00:27:02.994 "superblock": true, 00:27:02.994 "num_base_bdevs": 2, 00:27:02.994 "num_base_bdevs_discovered": 2, 00:27:02.994 "num_base_bdevs_operational": 2, 00:27:02.994 "base_bdevs_list": [ 00:27:02.994 { 00:27:02.994 "name": "spare", 00:27:02.994 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:27:02.994 "is_configured": true, 00:27:02.994 "data_offset": 256, 00:27:02.994 "data_size": 7936 00:27:02.994 }, 00:27:02.994 { 00:27:02.994 "name": "BaseBdev2", 00:27:02.994 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:02.994 "is_configured": true, 00:27:02.994 "data_offset": 256, 00:27:02.994 "data_size": 7936 00:27:02.994 } 00:27:02.994 ] 00:27:02.994 }' 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:02.994 09:31:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.254 09:31:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:03.254 "name": "raid_bdev1", 00:27:03.254 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:03.254 "strip_size_kb": 0, 00:27:03.254 "state": "online", 00:27:03.254 "raid_level": "raid1", 00:27:03.254 "superblock": true, 00:27:03.254 "num_base_bdevs": 2, 00:27:03.254 "num_base_bdevs_discovered": 2, 00:27:03.254 "num_base_bdevs_operational": 2, 00:27:03.254 "base_bdevs_list": [ 00:27:03.254 { 00:27:03.254 "name": "spare", 00:27:03.254 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:27:03.254 "is_configured": true, 00:27:03.254 "data_offset": 256, 00:27:03.254 "data_size": 7936 00:27:03.254 }, 00:27:03.254 { 00:27:03.254 "name": 
"BaseBdev2", 00:27:03.254 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:03.254 "is_configured": true, 00:27:03.254 "data_offset": 256, 00:27:03.254 "data_size": 7936 00:27:03.254 } 00:27:03.254 ] 00:27:03.254 }' 00:27:03.254 09:31:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:03.254 09:31:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:03.822 09:31:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:04.081 [2024-07-15 09:31:12.948687] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:04.081 [2024-07-15 09:31:12.948715] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:04.081 [2024-07-15 09:31:12.948773] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:04.081 [2024-07-15 09:31:12.948827] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:04.081 [2024-07-15 09:31:12.948839] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2100070 name raid_bdev1, state offline 00:27:04.081 09:31:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.081 09:31:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:04.340 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:04.598 /dev/nbd0 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@867 -- # local i 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:04.598 1+0 records in 00:27:04.598 1+0 records out 00:27:04.598 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225529 s, 18.2 MB/s 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:04.598 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:04.856 /dev/nbd1 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:04.856 1+0 records in 00:27:04.856 1+0 records out 00:27:04.856 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000331931 s, 12.3 MB/s 00:27:04.856 
09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:04.856 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:05.114 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:05.114 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:05.114 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:05.114 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:05.114 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:05.114 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:05.114 09:31:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:05.371 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:05.629 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:05.629 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:05.629 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:05.629 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:05.630 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:05.630 09:31:14 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:05.630 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:05.630 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:05.630 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:05.630 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:05.888 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:06.147 [2024-07-15 09:31:14.878692] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:06.147 [2024-07-15 09:31:14.878740] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:06.147 [2024-07-15 09:31:14.878765] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20ff500 00:27:06.147 [2024-07-15 09:31:14.878778] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:06.147 [2024-07-15 09:31:14.880406] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:06.147 [2024-07-15 09:31:14.880434] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:06.147 [2024-07-15 09:31:14.880513] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:06.147 [2024-07-15 09:31:14.880539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:06.147 [2024-07-15 09:31:14.880637] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:06.147 spare 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:06.147 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:06.148 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.148 09:31:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.148 [2024-07-15 09:31:14.980950] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20fe260 00:27:06.148 [2024-07-15 
09:31:14.980968] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:06.148 [2024-07-15 09:31:14.981164] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f9490 00:27:06.148 [2024-07-15 09:31:14.981310] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20fe260 00:27:06.148 [2024-07-15 09:31:14.981320] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20fe260 00:27:06.148 [2024-07-15 09:31:14.981424] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:06.406 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:06.406 "name": "raid_bdev1", 00:27:06.406 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:06.406 "strip_size_kb": 0, 00:27:06.406 "state": "online", 00:27:06.406 "raid_level": "raid1", 00:27:06.406 "superblock": true, 00:27:06.406 "num_base_bdevs": 2, 00:27:06.406 "num_base_bdevs_discovered": 2, 00:27:06.406 "num_base_bdevs_operational": 2, 00:27:06.406 "base_bdevs_list": [ 00:27:06.406 { 00:27:06.406 "name": "spare", 00:27:06.406 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:27:06.406 "is_configured": true, 00:27:06.406 "data_offset": 256, 00:27:06.406 "data_size": 7936 00:27:06.406 }, 00:27:06.406 { 00:27:06.406 "name": "BaseBdev2", 00:27:06.406 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:06.406 "is_configured": true, 00:27:06.406 "data_offset": 256, 00:27:06.406 "data_size": 7936 00:27:06.406 } 00:27:06.406 ] 00:27:06.406 }' 00:27:06.406 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:06.406 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:06.972 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:06.972 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:06.972 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:06.972 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:06.972 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:06.972 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.972 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:07.231 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:07.231 "name": "raid_bdev1", 00:27:07.231 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:07.231 "strip_size_kb": 0, 00:27:07.231 "state": "online", 00:27:07.231 "raid_level": "raid1", 00:27:07.231 "superblock": true, 00:27:07.231 "num_base_bdevs": 2, 00:27:07.231 "num_base_bdevs_discovered": 2, 00:27:07.231 "num_base_bdevs_operational": 2, 00:27:07.231 "base_bdevs_list": [ 00:27:07.231 { 00:27:07.231 "name": "spare", 00:27:07.231 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:27:07.231 "is_configured": true, 00:27:07.231 "data_offset": 256, 00:27:07.231 "data_size": 7936 00:27:07.231 }, 00:27:07.231 { 00:27:07.231 "name": "BaseBdev2", 00:27:07.231 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:07.231 "is_configured": true, 00:27:07.231 
"data_offset": 256, 00:27:07.231 "data_size": 7936 00:27:07.231 } 00:27:07.231 ] 00:27:07.231 }' 00:27:07.231 09:31:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:07.231 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:07.231 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:07.231 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:07.231 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:07.231 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.490 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:07.490 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:08.056 [2024-07-15 09:31:16.791918] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.056 09:31:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:08.314 09:31:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.314 "name": "raid_bdev1", 00:27:08.314 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:08.314 "strip_size_kb": 0, 00:27:08.314 "state": "online", 00:27:08.314 "raid_level": "raid1", 00:27:08.314 "superblock": true, 00:27:08.314 "num_base_bdevs": 2, 00:27:08.314 "num_base_bdevs_discovered": 1, 00:27:08.314 "num_base_bdevs_operational": 1, 00:27:08.314 "base_bdevs_list": [ 00:27:08.314 { 00:27:08.314 "name": null, 00:27:08.314 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.314 "is_configured": false, 00:27:08.314 "data_offset": 256, 00:27:08.314 "data_size": 7936 00:27:08.314 }, 00:27:08.314 { 00:27:08.314 "name": "BaseBdev2", 00:27:08.314 "uuid": 
"4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:08.314 "is_configured": true, 00:27:08.314 "data_offset": 256, 00:27:08.314 "data_size": 7936 00:27:08.314 } 00:27:08.314 ] 00:27:08.314 }' 00:27:08.314 09:31:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.314 09:31:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:08.879 09:31:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:09.137 [2024-07-15 09:31:17.882827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:09.137 [2024-07-15 09:31:17.882982] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:09.137 [2024-07-15 09:31:17.882999] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:09.137 [2024-07-15 09:31:17.883028] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:09.137 [2024-07-15 09:31:17.887816] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f9490 00:27:09.137 [2024-07-15 09:31:17.890244] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:09.137 09:31:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:10.070 09:31:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:10.070 09:31:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:10.070 09:31:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:10.070 09:31:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:10.070 09:31:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:10.070 09:31:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.070 09:31:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:10.329 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:10.329 "name": "raid_bdev1", 00:27:10.329 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:10.329 "strip_size_kb": 0, 00:27:10.329 "state": "online", 00:27:10.329 "raid_level": "raid1", 00:27:10.329 "superblock": true, 00:27:10.329 "num_base_bdevs": 2, 00:27:10.329 "num_base_bdevs_discovered": 2, 00:27:10.329 "num_base_bdevs_operational": 2, 00:27:10.329 "process": { 00:27:10.329 "type": "rebuild", 00:27:10.329 "target": "spare", 00:27:10.329 "progress": { 00:27:10.329 "blocks": 3072, 00:27:10.329 "percent": 38 00:27:10.329 } 00:27:10.329 }, 00:27:10.329 "base_bdevs_list": [ 00:27:10.329 { 00:27:10.329 "name": "spare", 00:27:10.329 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:27:10.329 "is_configured": true, 00:27:10.329 "data_offset": 256, 00:27:10.329 "data_size": 7936 00:27:10.329 }, 00:27:10.329 { 00:27:10.329 "name": "BaseBdev2", 00:27:10.329 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:10.329 "is_configured": true, 00:27:10.329 "data_offset": 256, 00:27:10.329 "data_size": 
7936 00:27:10.329 } 00:27:10.329 ] 00:27:10.329 }' 00:27:10.329 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:10.329 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:10.329 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:10.329 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:10.329 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:10.588 [2024-07-15 09:31:19.480601] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:10.588 [2024-07-15 09:31:19.502521] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:10.588 [2024-07-15 09:31:19.502566] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:10.588 [2024-07-15 09:31:19.502581] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:10.588 [2024-07-15 09:31:19.502590] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.588 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:10.846 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.846 "name": "raid_bdev1", 00:27:10.846 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:10.847 "strip_size_kb": 0, 00:27:10.847 "state": "online", 00:27:10.847 "raid_level": "raid1", 00:27:10.847 "superblock": true, 00:27:10.847 "num_base_bdevs": 2, 00:27:10.847 "num_base_bdevs_discovered": 1, 00:27:10.847 "num_base_bdevs_operational": 1, 00:27:10.847 "base_bdevs_list": [ 00:27:10.847 { 00:27:10.847 "name": null, 00:27:10.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:10.847 "is_configured": false, 00:27:10.847 "data_offset": 256, 00:27:10.847 "data_size": 7936 00:27:10.847 }, 00:27:10.847 { 00:27:10.847 
"name": "BaseBdev2", 00:27:10.847 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:10.847 "is_configured": true, 00:27:10.847 "data_offset": 256, 00:27:10.847 "data_size": 7936 00:27:10.847 } 00:27:10.847 ] 00:27:10.847 }' 00:27:10.847 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:10.847 09:31:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:11.782 09:31:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:11.782 [2024-07-15 09:31:20.526414] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:11.782 [2024-07-15 09:31:20.526467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:11.782 [2024-07-15 09:31:20.526494] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20ff730 00:27:11.782 [2024-07-15 09:31:20.526507] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:11.782 [2024-07-15 09:31:20.526874] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:11.782 [2024-07-15 09:31:20.526892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:11.782 [2024-07-15 09:31:20.526978] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:11.782 [2024-07-15 09:31:20.526990] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:11.782 [2024-07-15 09:31:20.527001] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:11.782 [2024-07-15 09:31:20.527019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:11.782 [2024-07-15 09:31:20.531871] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2100aa0 00:27:11.782 spare 00:27:11.782 [2024-07-15 09:31:20.533334] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:11.782 09:31:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:12.716 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:12.716 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:12.716 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:12.716 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:12.716 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:12.716 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.716 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.974 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:12.974 "name": "raid_bdev1", 00:27:12.974 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:12.974 "strip_size_kb": 0, 00:27:12.974 "state": "online", 00:27:12.974 "raid_level": "raid1", 00:27:12.974 "superblock": true, 00:27:12.974 "num_base_bdevs": 2, 00:27:12.974 "num_base_bdevs_discovered": 2, 00:27:12.974 "num_base_bdevs_operational": 2, 00:27:12.974 "process": { 00:27:12.974 "type": "rebuild", 00:27:12.974 "target": "spare", 00:27:12.974 "progress": { 00:27:12.974 "blocks": 2816, 00:27:12.974 "percent": 35 00:27:12.974 } 00:27:12.974 }, 00:27:12.974 "base_bdevs_list": [ 00:27:12.974 { 00:27:12.974 "name": "spare", 00:27:12.974 "uuid": "730cdfa9-00be-5008-8fa8-cbd7c2ab4ad4", 00:27:12.974 "is_configured": true, 00:27:12.974 "data_offset": 256, 00:27:12.974 "data_size": 7936 00:27:12.974 }, 00:27:12.974 { 00:27:12.974 "name": "BaseBdev2", 00:27:12.975 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:12.975 "is_configured": true, 00:27:12.975 "data_offset": 256, 00:27:12.975 "data_size": 7936 00:27:12.975 } 00:27:12.975 ] 00:27:12.975 }' 00:27:12.975 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:12.975 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:12.975 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:12.975 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:12.975 09:31:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:13.234 [2024-07-15 09:31:22.044590] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:13.234 [2024-07-15 09:31:22.045039] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:13.234 [2024-07-15 09:31:22.045081] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:13.234 [2024-07-15 09:31:22.045097] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:13.234 [2024-07-15 09:31:22.045105] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.234 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.493 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:13.493 "name": "raid_bdev1", 00:27:13.493 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:13.493 "strip_size_kb": 0, 00:27:13.493 "state": "online", 00:27:13.493 "raid_level": "raid1", 00:27:13.493 "superblock": true, 00:27:13.493 "num_base_bdevs": 2, 00:27:13.493 "num_base_bdevs_discovered": 1, 00:27:13.493 "num_base_bdevs_operational": 1, 00:27:13.493 "base_bdevs_list": [ 00:27:13.493 { 00:27:13.493 "name": null, 00:27:13.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.493 "is_configured": false, 00:27:13.493 "data_offset": 256, 00:27:13.493 "data_size": 7936 00:27:13.493 }, 00:27:13.493 { 00:27:13.493 "name": "BaseBdev2", 00:27:13.493 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:13.493 "is_configured": true, 00:27:13.493 "data_offset": 256, 00:27:13.493 "data_size": 7936 00:27:13.493 } 00:27:13.493 ] 00:27:13.493 }' 00:27:13.493 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:13.493 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:14.059 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:14.059 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:14.059 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:14.059 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:14.059 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:14.059 09:31:22 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.059 09:31:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:14.317 09:31:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:14.317 "name": "raid_bdev1", 00:27:14.317 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:14.317 "strip_size_kb": 0, 00:27:14.317 "state": "online", 00:27:14.317 "raid_level": "raid1", 00:27:14.317 "superblock": true, 00:27:14.317 "num_base_bdevs": 2, 00:27:14.317 "num_base_bdevs_discovered": 1, 00:27:14.317 "num_base_bdevs_operational": 1, 00:27:14.317 "base_bdevs_list": [ 00:27:14.317 { 00:27:14.317 "name": null, 00:27:14.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:14.317 "is_configured": false, 00:27:14.317 "data_offset": 256, 00:27:14.317 "data_size": 7936 00:27:14.317 }, 00:27:14.317 { 00:27:14.317 "name": "BaseBdev2", 00:27:14.317 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:14.317 "is_configured": true, 00:27:14.317 "data_offset": 256, 00:27:14.317 "data_size": 7936 00:27:14.317 } 00:27:14.317 ] 00:27:14.317 }' 00:27:14.317 09:31:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:14.317 09:31:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:14.318 09:31:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:14.318 09:31:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:14.318 09:31:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:14.576 09:31:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:14.837 [2024-07-15 09:31:23.690311] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:14.837 [2024-07-15 09:31:23.690360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:14.837 [2024-07-15 09:31:23.690386] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20fa650 00:27:14.837 [2024-07-15 09:31:23.690399] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:14.837 [2024-07-15 09:31:23.690745] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:14.837 [2024-07-15 09:31:23.690762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:14.837 [2024-07-15 09:31:23.690829] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:14.837 [2024-07-15 09:31:23.690842] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:14.837 [2024-07-15 09:31:23.690852] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:14.837 BaseBdev1 00:27:14.837 09:31:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:15.773 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:15.773 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:15.773 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.773 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.774 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.774 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:15.774 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.774 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.774 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.774 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.774 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.774 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.033 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:16.033 "name": "raid_bdev1", 00:27:16.033 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:16.033 "strip_size_kb": 0, 00:27:16.033 "state": "online", 00:27:16.033 "raid_level": "raid1", 00:27:16.033 "superblock": true, 00:27:16.033 "num_base_bdevs": 2, 00:27:16.033 "num_base_bdevs_discovered": 1, 00:27:16.033 "num_base_bdevs_operational": 1, 00:27:16.033 "base_bdevs_list": [ 00:27:16.033 { 00:27:16.033 "name": null, 00:27:16.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.033 "is_configured": false, 00:27:16.033 "data_offset": 256, 00:27:16.033 "data_size": 7936 00:27:16.033 }, 00:27:16.033 { 00:27:16.033 "name": "BaseBdev2", 00:27:16.033 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:16.033 "is_configured": true, 00:27:16.033 "data_offset": 256, 00:27:16.033 "data_size": 7936 00:27:16.033 } 00:27:16.033 ] 00:27:16.033 }' 00:27:16.033 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:16.033 09:31:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:16.974 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:16.974 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:16.974 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:16.974 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:16.974 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:16.974 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.974 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.974 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # raid_bdev_info='{ 00:27:16.974 "name": "raid_bdev1", 00:27:16.974 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:16.974 "strip_size_kb": 0, 00:27:16.974 "state": "online", 00:27:16.975 "raid_level": "raid1", 00:27:16.975 "superblock": true, 00:27:16.975 "num_base_bdevs": 2, 00:27:16.975 "num_base_bdevs_discovered": 1, 00:27:16.975 "num_base_bdevs_operational": 1, 00:27:16.975 "base_bdevs_list": [ 00:27:16.975 { 00:27:16.975 "name": null, 00:27:16.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.975 "is_configured": false, 00:27:16.975 "data_offset": 256, 00:27:16.975 "data_size": 7936 00:27:16.975 }, 00:27:16.975 { 00:27:16.975 "name": "BaseBdev2", 00:27:16.975 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:16.975 "is_configured": true, 00:27:16.975 "data_offset": 256, 00:27:16.975 "data_size": 7936 00:27:16.975 } 00:27:16.975 ] 00:27:16.975 }' 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:16.975 09:31:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:17.308 [2024-07-15 09:31:26.064626] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:17.308 [2024-07-15 09:31:26.064757] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: 
raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:17.308 [2024-07-15 09:31:26.064772] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:17.308 request: 00:27:17.308 { 00:27:17.308 "base_bdev": "BaseBdev1", 00:27:17.308 "raid_bdev": "raid_bdev1", 00:27:17.308 "method": "bdev_raid_add_base_bdev", 00:27:17.308 "req_id": 1 00:27:17.308 } 00:27:17.308 Got JSON-RPC error response 00:27:17.308 response: 00:27:17.308 { 00:27:17.308 "code": -22, 00:27:17.308 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:17.308 } 00:27:17.308 09:31:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:27:17.308 09:31:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:17.308 09:31:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:17.308 09:31:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:17.308 09:31:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.242 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.500 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:18.500 "name": "raid_bdev1", 00:27:18.500 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:18.500 "strip_size_kb": 0, 00:27:18.500 "state": "online", 00:27:18.500 "raid_level": "raid1", 00:27:18.500 "superblock": true, 00:27:18.500 "num_base_bdevs": 2, 00:27:18.500 "num_base_bdevs_discovered": 1, 00:27:18.500 "num_base_bdevs_operational": 1, 00:27:18.500 "base_bdevs_list": [ 00:27:18.500 { 00:27:18.500 "name": null, 00:27:18.500 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.500 "is_configured": false, 00:27:18.500 "data_offset": 256, 00:27:18.500 "data_size": 7936 00:27:18.500 }, 00:27:18.500 { 00:27:18.500 "name": "BaseBdev2", 00:27:18.500 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:18.500 "is_configured": true, 00:27:18.500 "data_offset": 256, 00:27:18.500 "data_size": 7936 
00:27:18.500 } 00:27:18.500 ] 00:27:18.500 }' 00:27:18.500 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:18.500 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:19.067 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:19.067 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:19.067 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:19.067 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:19.067 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:19.067 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.067 09:31:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:19.326 "name": "raid_bdev1", 00:27:19.326 "uuid": "67529098-c679-4fd9-a07d-8098accd0a25", 00:27:19.326 "strip_size_kb": 0, 00:27:19.326 "state": "online", 00:27:19.326 "raid_level": "raid1", 00:27:19.326 "superblock": true, 00:27:19.326 "num_base_bdevs": 2, 00:27:19.326 "num_base_bdevs_discovered": 1, 00:27:19.326 "num_base_bdevs_operational": 1, 00:27:19.326 "base_bdevs_list": [ 00:27:19.326 { 00:27:19.326 "name": null, 00:27:19.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.326 "is_configured": false, 00:27:19.326 "data_offset": 256, 00:27:19.326 "data_size": 7936 00:27:19.326 }, 00:27:19.326 { 00:27:19.326 "name": "BaseBdev2", 00:27:19.326 "uuid": "4fb3d67e-062a-53f3-a6b4-50fd8c00376d", 00:27:19.326 "is_configured": true, 00:27:19.326 "data_offset": 256, 00:27:19.326 "data_size": 7936 00:27:19.326 } 00:27:19.326 ] 00:27:19.326 }' 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 227065 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 227065 ']' 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 227065 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:19.326 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 227065 00:27:19.585 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:19.585 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:19.585 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 227065' 00:27:19.585 killing process with pid 227065 00:27:19.585 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 227065 00:27:19.585 Received shutdown signal, test time was about 60.000000 seconds 00:27:19.585 00:27:19.585 Latency(us) 00:27:19.585 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:19.585 =================================================================================================================== 00:27:19.585 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:19.585 [2024-07-15 09:31:28.312956] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:19.585 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 227065 00:27:19.585 [2024-07-15 09:31:28.313044] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:19.585 [2024-07-15 09:31:28.313085] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:19.585 [2024-07-15 09:31:28.313103] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20fe260 name raid_bdev1, state offline 00:27:19.585 [2024-07-15 09:31:28.339392] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:19.844 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:27:19.845 00:27:19.845 real 0m31.444s 00:27:19.845 user 0m48.862s 00:27:19.845 sys 0m5.290s 00:27:19.845 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:19.845 09:31:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:19.845 ************************************ 00:27:19.845 END TEST raid_rebuild_test_sb_4k 00:27:19.845 ************************************ 00:27:19.845 09:31:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:19.845 09:31:28 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:27:19.845 09:31:28 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:27:19.845 09:31:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:19.845 09:31:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:19.845 09:31:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:19.845 ************************************ 00:27:19.845 START TEST raid_state_function_test_sb_md_separate 00:27:19.845 ************************************ 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:19.845 09:31:28 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=231562 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 231562' 00:27:19.845 Process raid pid: 231562 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 231562 /var/tmp/spdk-raid.sock 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 231562 ']' 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:19.845 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
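[Note, not part of the captured run] The raid_state_function_test_sb_md_separate test starting here brings up a fresh bdev_svc RPC target on /var/tmp/spdk-raid.sock and, a little further down in the log, assembles Existed_Raid as a raid1 with superblock over 4096-byte-block malloc bdevs created with separate 32-byte metadata (-m 32). A minimal hand-driven sketch of that setup, assuming the bdev_svc app from this test is already listening on the socket and that the second base bdev mirrors the first's parameters:
    # base bdevs with md-separate layout (BaseBdev2 assumed to use the same parameters as BaseBdev1)
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2
    # raid1 with on-disk superblock (-s), as issued by the test
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    # check the array state the same way verify_raid_bdev_state does
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'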
00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:19.845 09:31:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:19.845 [2024-07-15 09:31:28.685481] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:27:19.845 [2024-07-15 09:31:28.685546] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:20.103 [2024-07-15 09:31:28.813470] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:20.103 [2024-07-15 09:31:28.915138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:20.103 [2024-07-15 09:31:28.976522] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:20.103 [2024-07-15 09:31:28.976550] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:20.669 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:20.669 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:20.669 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:20.927 [2024-07-15 09:31:29.842383] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:20.927 [2024-07-15 09:31:29.842425] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:20.927 [2024-07-15 09:31:29.842437] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:20.927 [2024-07-15 09:31:29.842448] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:20.927 09:31:29 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.927 09:31:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:21.186 09:31:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.186 "name": "Existed_Raid", 00:27:21.186 "uuid": "8f1ec69f-96db-43d0-9f73-87fdc3cdc961", 00:27:21.186 "strip_size_kb": 0, 00:27:21.186 "state": "configuring", 00:27:21.186 "raid_level": "raid1", 00:27:21.186 "superblock": true, 00:27:21.186 "num_base_bdevs": 2, 00:27:21.186 "num_base_bdevs_discovered": 0, 00:27:21.186 "num_base_bdevs_operational": 2, 00:27:21.186 "base_bdevs_list": [ 00:27:21.186 { 00:27:21.186 "name": "BaseBdev1", 00:27:21.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.186 "is_configured": false, 00:27:21.186 "data_offset": 0, 00:27:21.186 "data_size": 0 00:27:21.186 }, 00:27:21.186 { 00:27:21.186 "name": "BaseBdev2", 00:27:21.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.186 "is_configured": false, 00:27:21.186 "data_offset": 0, 00:27:21.186 "data_size": 0 00:27:21.186 } 00:27:21.186 ] 00:27:21.186 }' 00:27:21.186 09:31:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.186 09:31:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:21.753 09:31:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:22.012 [2024-07-15 09:31:30.893041] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:22.012 [2024-07-15 09:31:30.893076] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c1a80 name Existed_Raid, state configuring 00:27:22.012 09:31:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:22.579 [2024-07-15 09:31:31.394367] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:22.579 [2024-07-15 09:31:31.394401] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:22.579 [2024-07-15 09:31:31.394411] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:22.579 [2024-07-15 09:31:31.394422] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:22.579 09:31:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:27:22.837 [2024-07-15 09:31:31.659051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:22.837 BaseBdev1 00:27:22.837 09:31:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:22.837 09:31:31 bdev_raid.raid_state_function_test_sb_md_separate -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:22.837 09:31:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:22.837 09:31:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:22.837 09:31:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:22.837 09:31:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:22.837 09:31:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:23.095 09:31:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:23.662 [ 00:27:23.662 { 00:27:23.662 "name": "BaseBdev1", 00:27:23.662 "aliases": [ 00:27:23.662 "2124a51e-1050-4aa0-8226-1ea3d005fab1" 00:27:23.662 ], 00:27:23.662 "product_name": "Malloc disk", 00:27:23.662 "block_size": 4096, 00:27:23.662 "num_blocks": 8192, 00:27:23.662 "uuid": "2124a51e-1050-4aa0-8226-1ea3d005fab1", 00:27:23.662 "md_size": 32, 00:27:23.662 "md_interleave": false, 00:27:23.662 "dif_type": 0, 00:27:23.662 "assigned_rate_limits": { 00:27:23.662 "rw_ios_per_sec": 0, 00:27:23.662 "rw_mbytes_per_sec": 0, 00:27:23.662 "r_mbytes_per_sec": 0, 00:27:23.662 "w_mbytes_per_sec": 0 00:27:23.662 }, 00:27:23.662 "claimed": true, 00:27:23.662 "claim_type": "exclusive_write", 00:27:23.662 "zoned": false, 00:27:23.662 "supported_io_types": { 00:27:23.662 "read": true, 00:27:23.662 "write": true, 00:27:23.662 "unmap": true, 00:27:23.662 "flush": true, 00:27:23.662 "reset": true, 00:27:23.662 "nvme_admin": false, 00:27:23.662 "nvme_io": false, 00:27:23.662 "nvme_io_md": false, 00:27:23.662 "write_zeroes": true, 00:27:23.662 "zcopy": true, 00:27:23.662 "get_zone_info": false, 00:27:23.662 "zone_management": false, 00:27:23.662 "zone_append": false, 00:27:23.662 "compare": false, 00:27:23.662 "compare_and_write": false, 00:27:23.662 "abort": true, 00:27:23.662 "seek_hole": false, 00:27:23.662 "seek_data": false, 00:27:23.662 "copy": true, 00:27:23.662 "nvme_iov_md": false 00:27:23.662 }, 00:27:23.662 "memory_domains": [ 00:27:23.662 { 00:27:23.662 "dma_device_id": "system", 00:27:23.662 "dma_device_type": 1 00:27:23.662 }, 00:27:23.662 { 00:27:23.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:23.662 "dma_device_type": 2 00:27:23.662 } 00:27:23.662 ], 00:27:23.663 "driver_specific": {} 00:27:23.663 } 00:27:23.663 ] 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.663 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:23.921 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.921 "name": "Existed_Raid", 00:27:23.921 "uuid": "75318e31-6912-44f7-81ec-556a75e10df1", 00:27:23.921 "strip_size_kb": 0, 00:27:23.921 "state": "configuring", 00:27:23.921 "raid_level": "raid1", 00:27:23.921 "superblock": true, 00:27:23.921 "num_base_bdevs": 2, 00:27:23.921 "num_base_bdevs_discovered": 1, 00:27:23.921 "num_base_bdevs_operational": 2, 00:27:23.921 "base_bdevs_list": [ 00:27:23.921 { 00:27:23.921 "name": "BaseBdev1", 00:27:23.921 "uuid": "2124a51e-1050-4aa0-8226-1ea3d005fab1", 00:27:23.921 "is_configured": true, 00:27:23.921 "data_offset": 256, 00:27:23.921 "data_size": 7936 00:27:23.921 }, 00:27:23.921 { 00:27:23.921 "name": "BaseBdev2", 00:27:23.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.921 "is_configured": false, 00:27:23.921 "data_offset": 0, 00:27:23.921 "data_size": 0 00:27:23.921 } 00:27:23.921 ] 00:27:23.921 }' 00:27:23.921 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.921 09:31:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:24.486 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:24.486 [2024-07-15 09:31:33.431771] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:24.486 [2024-07-15 09:31:33.431806] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c1350 name Existed_Raid, state configuring 00:27:24.744 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:24.744 [2024-07-15 09:31:33.676458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:24.744 [2024-07-15 09:31:33.677861] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:24.744 [2024-07-15 09:31:33.677891] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:25.003 09:31:33 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:25.003 "name": "Existed_Raid", 00:27:25.003 "uuid": "2dca6015-e9cb-4f38-8ef7-7912a692c2f3", 00:27:25.003 "strip_size_kb": 0, 00:27:25.003 "state": "configuring", 00:27:25.003 "raid_level": "raid1", 00:27:25.003 "superblock": true, 00:27:25.003 "num_base_bdevs": 2, 00:27:25.003 "num_base_bdevs_discovered": 1, 00:27:25.003 "num_base_bdevs_operational": 2, 00:27:25.003 "base_bdevs_list": [ 00:27:25.003 { 00:27:25.003 "name": "BaseBdev1", 00:27:25.003 "uuid": "2124a51e-1050-4aa0-8226-1ea3d005fab1", 00:27:25.003 "is_configured": true, 00:27:25.003 "data_offset": 256, 00:27:25.003 "data_size": 7936 00:27:25.003 }, 00:27:25.003 { 00:27:25.003 "name": "BaseBdev2", 00:27:25.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.003 "is_configured": false, 00:27:25.003 "data_offset": 0, 00:27:25.003 "data_size": 0 00:27:25.003 } 00:27:25.003 ] 00:27:25.003 }' 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:25.003 09:31:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:25.938 09:31:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:27:25.938 [2024-07-15 09:31:34.767456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:25.938 [2024-07-15 09:31:34.767598] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13c3210 00:27:25.938 [2024-07-15 09:31:34.767611] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:25.938 [2024-07-15 09:31:34.767669] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c2c50 00:27:25.938 [2024-07-15 09:31:34.767771] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13c3210 00:27:25.938 [2024-07-15 09:31:34.767781] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13c3210 00:27:25.938 [2024-07-15 09:31:34.767846] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:25.938 BaseBdev2 00:27:25.938 09:31:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:25.938 09:31:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:25.938 09:31:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:25.938 09:31:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:25.938 09:31:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:25.938 09:31:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:25.938 09:31:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:26.197 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:26.456 [ 00:27:26.456 { 00:27:26.456 "name": "BaseBdev2", 00:27:26.456 "aliases": [ 00:27:26.456 "e5fa6826-e36c-41b5-b0bc-3954eb5d8192" 00:27:26.456 ], 00:27:26.456 "product_name": "Malloc disk", 00:27:26.456 "block_size": 4096, 00:27:26.456 "num_blocks": 8192, 00:27:26.456 "uuid": "e5fa6826-e36c-41b5-b0bc-3954eb5d8192", 00:27:26.456 "md_size": 32, 00:27:26.456 "md_interleave": false, 00:27:26.456 "dif_type": 0, 00:27:26.456 "assigned_rate_limits": { 00:27:26.456 "rw_ios_per_sec": 0, 00:27:26.456 "rw_mbytes_per_sec": 0, 00:27:26.456 "r_mbytes_per_sec": 0, 00:27:26.456 "w_mbytes_per_sec": 0 00:27:26.456 }, 00:27:26.456 "claimed": true, 00:27:26.456 "claim_type": "exclusive_write", 00:27:26.456 "zoned": false, 00:27:26.456 "supported_io_types": { 00:27:26.456 "read": true, 00:27:26.456 "write": true, 00:27:26.456 "unmap": true, 00:27:26.456 "flush": true, 00:27:26.456 "reset": true, 00:27:26.456 "nvme_admin": false, 00:27:26.456 "nvme_io": false, 00:27:26.456 "nvme_io_md": false, 00:27:26.456 "write_zeroes": true, 00:27:26.456 "zcopy": true, 00:27:26.456 "get_zone_info": false, 00:27:26.456 "zone_management": false, 00:27:26.456 "zone_append": false, 00:27:26.456 "compare": false, 00:27:26.456 "compare_and_write": false, 00:27:26.456 "abort": true, 00:27:26.456 "seek_hole": false, 00:27:26.456 "seek_data": false, 00:27:26.456 "copy": true, 00:27:26.456 "nvme_iov_md": false 00:27:26.456 }, 00:27:26.456 "memory_domains": [ 00:27:26.456 { 00:27:26.456 "dma_device_id": "system", 00:27:26.456 "dma_device_type": 1 00:27:26.456 }, 00:27:26.456 { 00:27:26.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:26.456 "dma_device_type": 2 00:27:26.456 } 00:27:26.456 ], 00:27:26.456 "driver_specific": 
{} 00:27:26.456 } 00:27:26.456 ] 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.456 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:26.716 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.716 "name": "Existed_Raid", 00:27:26.716 "uuid": "2dca6015-e9cb-4f38-8ef7-7912a692c2f3", 00:27:26.716 "strip_size_kb": 0, 00:27:26.716 "state": "online", 00:27:26.716 "raid_level": "raid1", 00:27:26.716 "superblock": true, 00:27:26.716 "num_base_bdevs": 2, 00:27:26.716 "num_base_bdevs_discovered": 2, 00:27:26.716 "num_base_bdevs_operational": 2, 00:27:26.716 "base_bdevs_list": [ 00:27:26.716 { 00:27:26.716 "name": "BaseBdev1", 00:27:26.716 "uuid": "2124a51e-1050-4aa0-8226-1ea3d005fab1", 00:27:26.716 "is_configured": true, 00:27:26.716 "data_offset": 256, 00:27:26.716 "data_size": 7936 00:27:26.716 }, 00:27:26.716 { 00:27:26.716 "name": "BaseBdev2", 00:27:26.716 "uuid": "e5fa6826-e36c-41b5-b0bc-3954eb5d8192", 00:27:26.716 "is_configured": true, 00:27:26.716 "data_offset": 256, 00:27:26.716 "data_size": 7936 00:27:26.716 } 00:27:26.716 ] 00:27:26.716 }' 00:27:26.716 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.716 09:31:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:27.285 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:27.285 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:27:27.285 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:27.285 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:27.285 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:27.285 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:27.285 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:27.285 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:27.544 [2024-07-15 09:31:36.355971] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:27.544 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:27.544 "name": "Existed_Raid", 00:27:27.544 "aliases": [ 00:27:27.544 "2dca6015-e9cb-4f38-8ef7-7912a692c2f3" 00:27:27.544 ], 00:27:27.544 "product_name": "Raid Volume", 00:27:27.544 "block_size": 4096, 00:27:27.544 "num_blocks": 7936, 00:27:27.544 "uuid": "2dca6015-e9cb-4f38-8ef7-7912a692c2f3", 00:27:27.544 "md_size": 32, 00:27:27.544 "md_interleave": false, 00:27:27.544 "dif_type": 0, 00:27:27.544 "assigned_rate_limits": { 00:27:27.544 "rw_ios_per_sec": 0, 00:27:27.544 "rw_mbytes_per_sec": 0, 00:27:27.544 "r_mbytes_per_sec": 0, 00:27:27.544 "w_mbytes_per_sec": 0 00:27:27.544 }, 00:27:27.544 "claimed": false, 00:27:27.544 "zoned": false, 00:27:27.544 "supported_io_types": { 00:27:27.544 "read": true, 00:27:27.544 "write": true, 00:27:27.544 "unmap": false, 00:27:27.544 "flush": false, 00:27:27.544 "reset": true, 00:27:27.544 "nvme_admin": false, 00:27:27.544 "nvme_io": false, 00:27:27.544 "nvme_io_md": false, 00:27:27.544 "write_zeroes": true, 00:27:27.544 "zcopy": false, 00:27:27.544 "get_zone_info": false, 00:27:27.544 "zone_management": false, 00:27:27.544 "zone_append": false, 00:27:27.544 "compare": false, 00:27:27.544 "compare_and_write": false, 00:27:27.544 "abort": false, 00:27:27.544 "seek_hole": false, 00:27:27.544 "seek_data": false, 00:27:27.544 "copy": false, 00:27:27.544 "nvme_iov_md": false 00:27:27.545 }, 00:27:27.545 "memory_domains": [ 00:27:27.545 { 00:27:27.545 "dma_device_id": "system", 00:27:27.545 "dma_device_type": 1 00:27:27.545 }, 00:27:27.545 { 00:27:27.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:27.545 "dma_device_type": 2 00:27:27.545 }, 00:27:27.545 { 00:27:27.545 "dma_device_id": "system", 00:27:27.545 "dma_device_type": 1 00:27:27.545 }, 00:27:27.545 { 00:27:27.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:27.545 "dma_device_type": 2 00:27:27.545 } 00:27:27.545 ], 00:27:27.545 "driver_specific": { 00:27:27.545 "raid": { 00:27:27.545 "uuid": "2dca6015-e9cb-4f38-8ef7-7912a692c2f3", 00:27:27.545 "strip_size_kb": 0, 00:27:27.545 "state": "online", 00:27:27.545 "raid_level": "raid1", 00:27:27.545 "superblock": true, 00:27:27.545 "num_base_bdevs": 2, 00:27:27.545 "num_base_bdevs_discovered": 2, 00:27:27.545 "num_base_bdevs_operational": 2, 00:27:27.545 "base_bdevs_list": [ 00:27:27.545 { 00:27:27.545 "name": "BaseBdev1", 00:27:27.545 "uuid": "2124a51e-1050-4aa0-8226-1ea3d005fab1", 00:27:27.545 "is_configured": true, 00:27:27.545 "data_offset": 256, 00:27:27.545 "data_size": 7936 
00:27:27.545 }, 00:27:27.545 { 00:27:27.545 "name": "BaseBdev2", 00:27:27.545 "uuid": "e5fa6826-e36c-41b5-b0bc-3954eb5d8192", 00:27:27.545 "is_configured": true, 00:27:27.545 "data_offset": 256, 00:27:27.545 "data_size": 7936 00:27:27.545 } 00:27:27.545 ] 00:27:27.545 } 00:27:27.545 } 00:27:27.545 }' 00:27:27.545 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:27.545 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:27.545 BaseBdev2' 00:27:27.545 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:27.545 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:27.545 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:27.804 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:27.804 "name": "BaseBdev1", 00:27:27.804 "aliases": [ 00:27:27.804 "2124a51e-1050-4aa0-8226-1ea3d005fab1" 00:27:27.804 ], 00:27:27.804 "product_name": "Malloc disk", 00:27:27.804 "block_size": 4096, 00:27:27.804 "num_blocks": 8192, 00:27:27.804 "uuid": "2124a51e-1050-4aa0-8226-1ea3d005fab1", 00:27:27.804 "md_size": 32, 00:27:27.804 "md_interleave": false, 00:27:27.804 "dif_type": 0, 00:27:27.804 "assigned_rate_limits": { 00:27:27.804 "rw_ios_per_sec": 0, 00:27:27.804 "rw_mbytes_per_sec": 0, 00:27:27.804 "r_mbytes_per_sec": 0, 00:27:27.804 "w_mbytes_per_sec": 0 00:27:27.804 }, 00:27:27.804 "claimed": true, 00:27:27.804 "claim_type": "exclusive_write", 00:27:27.804 "zoned": false, 00:27:27.804 "supported_io_types": { 00:27:27.804 "read": true, 00:27:27.804 "write": true, 00:27:27.804 "unmap": true, 00:27:27.804 "flush": true, 00:27:27.804 "reset": true, 00:27:27.804 "nvme_admin": false, 00:27:27.804 "nvme_io": false, 00:27:27.804 "nvme_io_md": false, 00:27:27.804 "write_zeroes": true, 00:27:27.804 "zcopy": true, 00:27:27.804 "get_zone_info": false, 00:27:27.804 "zone_management": false, 00:27:27.804 "zone_append": false, 00:27:27.804 "compare": false, 00:27:27.804 "compare_and_write": false, 00:27:27.804 "abort": true, 00:27:27.804 "seek_hole": false, 00:27:27.804 "seek_data": false, 00:27:27.804 "copy": true, 00:27:27.804 "nvme_iov_md": false 00:27:27.804 }, 00:27:27.804 "memory_domains": [ 00:27:27.804 { 00:27:27.804 "dma_device_id": "system", 00:27:27.804 "dma_device_type": 1 00:27:27.804 }, 00:27:27.804 { 00:27:27.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:27.804 "dma_device_type": 2 00:27:27.804 } 00:27:27.804 ], 00:27:27.804 "driver_specific": {} 00:27:27.804 }' 00:27:27.804 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:27.804 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:28.063 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:28.063 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:28.063 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:28.063 09:31:36 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:28.063 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:28.063 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:28.063 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:28.063 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:28.063 09:31:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:28.323 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:28.323 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:28.323 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:28.323 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:28.323 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:28.323 "name": "BaseBdev2", 00:27:28.323 "aliases": [ 00:27:28.323 "e5fa6826-e36c-41b5-b0bc-3954eb5d8192" 00:27:28.323 ], 00:27:28.323 "product_name": "Malloc disk", 00:27:28.323 "block_size": 4096, 00:27:28.323 "num_blocks": 8192, 00:27:28.323 "uuid": "e5fa6826-e36c-41b5-b0bc-3954eb5d8192", 00:27:28.323 "md_size": 32, 00:27:28.323 "md_interleave": false, 00:27:28.323 "dif_type": 0, 00:27:28.323 "assigned_rate_limits": { 00:27:28.323 "rw_ios_per_sec": 0, 00:27:28.323 "rw_mbytes_per_sec": 0, 00:27:28.323 "r_mbytes_per_sec": 0, 00:27:28.323 "w_mbytes_per_sec": 0 00:27:28.323 }, 00:27:28.323 "claimed": true, 00:27:28.323 "claim_type": "exclusive_write", 00:27:28.323 "zoned": false, 00:27:28.323 "supported_io_types": { 00:27:28.323 "read": true, 00:27:28.323 "write": true, 00:27:28.323 "unmap": true, 00:27:28.323 "flush": true, 00:27:28.323 "reset": true, 00:27:28.323 "nvme_admin": false, 00:27:28.323 "nvme_io": false, 00:27:28.323 "nvme_io_md": false, 00:27:28.323 "write_zeroes": true, 00:27:28.323 "zcopy": true, 00:27:28.323 "get_zone_info": false, 00:27:28.323 "zone_management": false, 00:27:28.323 "zone_append": false, 00:27:28.323 "compare": false, 00:27:28.323 "compare_and_write": false, 00:27:28.323 "abort": true, 00:27:28.323 "seek_hole": false, 00:27:28.323 "seek_data": false, 00:27:28.323 "copy": true, 00:27:28.323 "nvme_iov_md": false 00:27:28.323 }, 00:27:28.323 "memory_domains": [ 00:27:28.323 { 00:27:28.323 "dma_device_id": "system", 00:27:28.323 "dma_device_type": 1 00:27:28.323 }, 00:27:28.323 { 00:27:28.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:28.323 "dma_device_type": 2 00:27:28.323 } 00:27:28.323 ], 00:27:28.323 "driver_specific": {} 00:27:28.323 }' 00:27:28.323 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:28.582 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:28.582 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:28.582 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:27:28.582 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:28.582 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:28.582 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:28.582 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:28.842 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:28.842 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:28.842 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:28.842 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:28.842 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:29.101 [2024-07-15 09:31:37.843787] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.101 09:31:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:29.360 09:31:38 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:29.360 "name": "Existed_Raid", 00:27:29.360 "uuid": "2dca6015-e9cb-4f38-8ef7-7912a692c2f3", 00:27:29.360 "strip_size_kb": 0, 00:27:29.360 "state": "online", 00:27:29.360 "raid_level": "raid1", 00:27:29.360 "superblock": true, 00:27:29.360 "num_base_bdevs": 2, 00:27:29.360 "num_base_bdevs_discovered": 1, 00:27:29.360 "num_base_bdevs_operational": 1, 00:27:29.360 "base_bdevs_list": [ 00:27:29.360 { 00:27:29.360 "name": null, 00:27:29.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.360 "is_configured": false, 00:27:29.360 "data_offset": 256, 00:27:29.360 "data_size": 7936 00:27:29.360 }, 00:27:29.360 { 00:27:29.360 "name": "BaseBdev2", 00:27:29.360 "uuid": "e5fa6826-e36c-41b5-b0bc-3954eb5d8192", 00:27:29.360 "is_configured": true, 00:27:29.360 "data_offset": 256, 00:27:29.360 "data_size": 7936 00:27:29.360 } 00:27:29.360 ] 00:27:29.360 }' 00:27:29.360 09:31:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:29.360 09:31:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:29.928 09:31:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:29.928 09:31:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:29.928 09:31:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:29.928 09:31:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.186 09:31:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:30.186 09:31:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:30.186 09:31:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:30.186 [2024-07-15 09:31:39.071526] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:30.186 [2024-07-15 09:31:39.071607] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:30.186 [2024-07-15 09:31:39.082961] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:30.186 [2024-07-15 09:31:39.082996] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:30.186 [2024-07-15 09:31:39.083007] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c3210 name Existed_Raid, state offline 00:27:30.186 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:30.186 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:30.187 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.187 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:30.447 09:31:39 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:30.447 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:30.447 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:30.447 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 231562 00:27:30.447 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 231562 ']' 00:27:30.447 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 231562 00:27:30.447 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:30.447 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:30.447 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 231562 00:27:30.770 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:30.770 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:30.770 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 231562' 00:27:30.770 killing process with pid 231562 00:27:30.770 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 231562 00:27:30.770 [2024-07-15 09:31:39.410623] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:30.770 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 231562 00:27:30.770 [2024-07-15 09:31:39.411513] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:30.770 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:27:30.770 00:27:30.770 real 0m11.014s 00:27:30.770 user 0m19.545s 00:27:30.770 sys 0m2.085s 00:27:30.770 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:30.770 09:31:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:30.770 ************************************ 00:27:30.770 END TEST raid_state_function_test_sb_md_separate 00:27:30.770 ************************************ 00:27:30.770 09:31:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:30.770 09:31:39 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:27:30.770 09:31:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:30.770 09:31:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:30.770 09:31:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:30.770 ************************************ 00:27:30.770 START TEST raid_superblock_test_md_separate 00:27:30.770 ************************************ 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=233204 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 233204 /var/tmp/spdk-raid.sock 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:30.770 09:31:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 233204 ']' 00:27:31.029 09:31:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:31.029 09:31:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:31.029 09:31:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:31.029 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:31.029 09:31:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:31.029 09:31:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:31.029 [2024-07-15 09:31:39.781550] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
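The superblock test builds each base bdev as a malloc disk with separate (non-interleaved) metadata and wraps it in a passthru bdev before assembling the array; the per-bdev RPC pattern exercised in the trace that follows looks roughly like this (a sketch only; $rpc_py is shorthand, names and UUIDs are the ones the test uses):

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # 32 MiB malloc disk, 4096-byte blocks, 32 bytes of separate metadata per block
    $rpc_py bdev_malloc_create 32 4096 -m 32 -b malloc1
    # wrap it in a passthru bdev with a fixed UUID
    $rpc_py bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    # same for the second leg, then assemble the raid1 volume with a superblock (-s)
    $rpc_py bdev_malloc_create 32 4096 -m 32 -b malloc2
    $rpc_py bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    $rpc_py bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s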
00:27:31.029 [2024-07-15 09:31:39.781612] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233204 ] 00:27:31.029 [2024-07-15 09:31:39.909059] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:31.287 [2024-07-15 09:31:40.008734] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:31.287 [2024-07-15 09:31:40.075395] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:31.287 [2024-07-15 09:31:40.075428] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:31.855 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:27:32.114 malloc1 00:27:32.114 09:31:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:32.373 [2024-07-15 09:31:41.188558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:32.373 [2024-07-15 09:31:41.188605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:32.373 [2024-07-15 09:31:41.188625] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ad830 00:27:32.373 [2024-07-15 09:31:41.188638] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:32.373 [2024-07-15 09:31:41.190180] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:32.373 [2024-07-15 09:31:41.190208] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:32.373 pt1 00:27:32.373 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:32.373 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:32.373 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc2 00:27:32.373 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:32.373 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:32.373 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:32.373 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:32.373 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:32.373 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:27:32.632 malloc2 00:27:32.632 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:32.891 [2024-07-15 09:31:41.683397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:32.891 [2024-07-15 09:31:41.683445] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:32.891 [2024-07-15 09:31:41.683464] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x239f250 00:27:32.891 [2024-07-15 09:31:41.683478] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:32.891 [2024-07-15 09:31:41.684945] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:32.891 [2024-07-15 09:31:41.684972] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:32.892 pt2 00:27:32.892 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:32.892 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:32.892 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:33.151 [2024-07-15 09:31:41.928063] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:33.151 [2024-07-15 09:31:41.929480] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:33.151 [2024-07-15 09:31:41.929630] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x239fd20 00:27:33.151 [2024-07-15 09:31:41.929643] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:33.151 [2024-07-15 09:31:41.929717] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2393a60 00:27:33.151 [2024-07-15 09:31:41.929833] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x239fd20 00:27:33.151 [2024-07-15 09:31:41.929843] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x239fd20 00:27:33.151 [2024-07-15 09:31:41.929917] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:33.151 09:31:41 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.151 09:31:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:33.415 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:33.415 "name": "raid_bdev1", 00:27:33.415 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:33.415 "strip_size_kb": 0, 00:27:33.415 "state": "online", 00:27:33.415 "raid_level": "raid1", 00:27:33.415 "superblock": true, 00:27:33.415 "num_base_bdevs": 2, 00:27:33.415 "num_base_bdevs_discovered": 2, 00:27:33.415 "num_base_bdevs_operational": 2, 00:27:33.415 "base_bdevs_list": [ 00:27:33.415 { 00:27:33.415 "name": "pt1", 00:27:33.415 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:33.415 "is_configured": true, 00:27:33.415 "data_offset": 256, 00:27:33.415 "data_size": 7936 00:27:33.415 }, 00:27:33.415 { 00:27:33.415 "name": "pt2", 00:27:33.415 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:33.415 "is_configured": true, 00:27:33.415 "data_offset": 256, 00:27:33.415 "data_size": 7936 00:27:33.415 } 00:27:33.415 ] 00:27:33.415 }' 00:27:33.415 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:33.415 09:31:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:33.981 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:33.981 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:33.981 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:33.981 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:33.981 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:33.981 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:33.981 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:33.981 09:31:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:34.239 [2024-07-15 09:31:43.019202] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:34.239 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:34.239 "name": "raid_bdev1", 00:27:34.239 "aliases": [ 00:27:34.239 "7c628d48-600d-4ef2-bb0a-aeb51599ee36" 00:27:34.239 ], 00:27:34.239 "product_name": "Raid Volume", 00:27:34.239 "block_size": 4096, 00:27:34.239 "num_blocks": 7936, 00:27:34.239 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:34.239 "md_size": 32, 00:27:34.239 "md_interleave": false, 00:27:34.239 "dif_type": 0, 00:27:34.239 "assigned_rate_limits": { 00:27:34.239 "rw_ios_per_sec": 0, 00:27:34.239 "rw_mbytes_per_sec": 0, 00:27:34.239 "r_mbytes_per_sec": 0, 00:27:34.239 "w_mbytes_per_sec": 0 00:27:34.239 }, 00:27:34.239 "claimed": false, 00:27:34.239 "zoned": false, 00:27:34.239 "supported_io_types": { 00:27:34.239 "read": true, 00:27:34.239 "write": true, 00:27:34.239 "unmap": false, 00:27:34.239 "flush": false, 00:27:34.239 "reset": true, 00:27:34.239 "nvme_admin": false, 00:27:34.239 "nvme_io": false, 00:27:34.239 "nvme_io_md": false, 00:27:34.239 "write_zeroes": true, 00:27:34.239 "zcopy": false, 00:27:34.239 "get_zone_info": false, 00:27:34.239 "zone_management": false, 00:27:34.239 "zone_append": false, 00:27:34.239 "compare": false, 00:27:34.239 "compare_and_write": false, 00:27:34.239 "abort": false, 00:27:34.239 "seek_hole": false, 00:27:34.239 "seek_data": false, 00:27:34.239 "copy": false, 00:27:34.239 "nvme_iov_md": false 00:27:34.239 }, 00:27:34.239 "memory_domains": [ 00:27:34.239 { 00:27:34.239 "dma_device_id": "system", 00:27:34.239 "dma_device_type": 1 00:27:34.239 }, 00:27:34.239 { 00:27:34.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:34.239 "dma_device_type": 2 00:27:34.239 }, 00:27:34.239 { 00:27:34.239 "dma_device_id": "system", 00:27:34.239 "dma_device_type": 1 00:27:34.239 }, 00:27:34.239 { 00:27:34.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:34.239 "dma_device_type": 2 00:27:34.239 } 00:27:34.239 ], 00:27:34.239 "driver_specific": { 00:27:34.239 "raid": { 00:27:34.239 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:34.239 "strip_size_kb": 0, 00:27:34.239 "state": "online", 00:27:34.239 "raid_level": "raid1", 00:27:34.239 "superblock": true, 00:27:34.239 "num_base_bdevs": 2, 00:27:34.239 "num_base_bdevs_discovered": 2, 00:27:34.239 "num_base_bdevs_operational": 2, 00:27:34.239 "base_bdevs_list": [ 00:27:34.239 { 00:27:34.239 "name": "pt1", 00:27:34.239 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:34.239 "is_configured": true, 00:27:34.239 "data_offset": 256, 00:27:34.240 "data_size": 7936 00:27:34.240 }, 00:27:34.240 { 00:27:34.240 "name": "pt2", 00:27:34.240 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:34.240 "is_configured": true, 00:27:34.240 "data_offset": 256, 00:27:34.240 "data_size": 7936 00:27:34.240 } 00:27:34.240 ] 00:27:34.240 } 00:27:34.240 } 00:27:34.240 }' 00:27:34.240 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:34.240 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:34.240 pt2' 00:27:34.240 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 
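Each base bdev is then validated the same way: dump it with bdev_get_bdevs and assert the md-separate layout fields with jq, as the trace does for pt1 above and for pt2 below. A rough sketch of those checks (same $rpc_py shorthand as before):

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    base_bdev_info=$($rpc_py bdev_get_bdevs -b pt1 | jq '.[]')
    [[ $(echo "$base_bdev_info" | jq .block_size)    == 4096  ]]  # 4 KiB data blocks
    [[ $(echo "$base_bdev_info" | jq .md_size)       == 32    ]]  # 32-byte per-block metadata
    [[ $(echo "$base_bdev_info" | jq .md_interleave) == false ]]  # metadata kept in a separate buffer
    [[ $(echo "$base_bdev_info" | jq .dif_type)      == 0     ]]  # no DIF protection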
00:27:34.240 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:34.240 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:34.497 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:34.497 "name": "pt1", 00:27:34.497 "aliases": [ 00:27:34.497 "00000000-0000-0000-0000-000000000001" 00:27:34.497 ], 00:27:34.497 "product_name": "passthru", 00:27:34.497 "block_size": 4096, 00:27:34.497 "num_blocks": 8192, 00:27:34.497 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:34.497 "md_size": 32, 00:27:34.497 "md_interleave": false, 00:27:34.497 "dif_type": 0, 00:27:34.497 "assigned_rate_limits": { 00:27:34.497 "rw_ios_per_sec": 0, 00:27:34.497 "rw_mbytes_per_sec": 0, 00:27:34.497 "r_mbytes_per_sec": 0, 00:27:34.497 "w_mbytes_per_sec": 0 00:27:34.497 }, 00:27:34.497 "claimed": true, 00:27:34.497 "claim_type": "exclusive_write", 00:27:34.497 "zoned": false, 00:27:34.497 "supported_io_types": { 00:27:34.497 "read": true, 00:27:34.497 "write": true, 00:27:34.497 "unmap": true, 00:27:34.497 "flush": true, 00:27:34.497 "reset": true, 00:27:34.497 "nvme_admin": false, 00:27:34.497 "nvme_io": false, 00:27:34.497 "nvme_io_md": false, 00:27:34.497 "write_zeroes": true, 00:27:34.497 "zcopy": true, 00:27:34.497 "get_zone_info": false, 00:27:34.497 "zone_management": false, 00:27:34.497 "zone_append": false, 00:27:34.497 "compare": false, 00:27:34.497 "compare_and_write": false, 00:27:34.497 "abort": true, 00:27:34.497 "seek_hole": false, 00:27:34.497 "seek_data": false, 00:27:34.497 "copy": true, 00:27:34.497 "nvme_iov_md": false 00:27:34.497 }, 00:27:34.497 "memory_domains": [ 00:27:34.497 { 00:27:34.497 "dma_device_id": "system", 00:27:34.497 "dma_device_type": 1 00:27:34.497 }, 00:27:34.497 { 00:27:34.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:34.497 "dma_device_type": 2 00:27:34.497 } 00:27:34.497 ], 00:27:34.497 "driver_specific": { 00:27:34.497 "passthru": { 00:27:34.497 "name": "pt1", 00:27:34.497 "base_bdev_name": "malloc1" 00:27:34.497 } 00:27:34.497 } 00:27:34.497 }' 00:27:34.497 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:34.497 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:34.497 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:34.497 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:34.755 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:35.012 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:35.012 "name": "pt2", 00:27:35.012 "aliases": [ 00:27:35.013 "00000000-0000-0000-0000-000000000002" 00:27:35.013 ], 00:27:35.013 "product_name": "passthru", 00:27:35.013 "block_size": 4096, 00:27:35.013 "num_blocks": 8192, 00:27:35.013 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:35.013 "md_size": 32, 00:27:35.013 "md_interleave": false, 00:27:35.013 "dif_type": 0, 00:27:35.013 "assigned_rate_limits": { 00:27:35.013 "rw_ios_per_sec": 0, 00:27:35.013 "rw_mbytes_per_sec": 0, 00:27:35.013 "r_mbytes_per_sec": 0, 00:27:35.013 "w_mbytes_per_sec": 0 00:27:35.013 }, 00:27:35.013 "claimed": true, 00:27:35.013 "claim_type": "exclusive_write", 00:27:35.013 "zoned": false, 00:27:35.013 "supported_io_types": { 00:27:35.013 "read": true, 00:27:35.013 "write": true, 00:27:35.013 "unmap": true, 00:27:35.013 "flush": true, 00:27:35.013 "reset": true, 00:27:35.013 "nvme_admin": false, 00:27:35.013 "nvme_io": false, 00:27:35.013 "nvme_io_md": false, 00:27:35.013 "write_zeroes": true, 00:27:35.013 "zcopy": true, 00:27:35.013 "get_zone_info": false, 00:27:35.013 "zone_management": false, 00:27:35.013 "zone_append": false, 00:27:35.013 "compare": false, 00:27:35.013 "compare_and_write": false, 00:27:35.013 "abort": true, 00:27:35.013 "seek_hole": false, 00:27:35.013 "seek_data": false, 00:27:35.013 "copy": true, 00:27:35.013 "nvme_iov_md": false 00:27:35.013 }, 00:27:35.013 "memory_domains": [ 00:27:35.013 { 00:27:35.013 "dma_device_id": "system", 00:27:35.013 "dma_device_type": 1 00:27:35.013 }, 00:27:35.013 { 00:27:35.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.013 "dma_device_type": 2 00:27:35.013 } 00:27:35.013 ], 00:27:35.013 "driver_specific": { 00:27:35.013 "passthru": { 00:27:35.013 "name": "pt2", 00:27:35.013 "base_bdev_name": "malloc2" 00:27:35.013 } 00:27:35.013 } 00:27:35.013 }' 00:27:35.013 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:35.270 09:31:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:35.270 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:35.270 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:35.270 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:35.270 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:35.270 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:35.270 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:35.270 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:35.270 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:35.527 09:31:44 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:35.527 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:35.527 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:35.527 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:35.784 [2024-07-15 09:31:44.507175] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:35.784 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7c628d48-600d-4ef2-bb0a-aeb51599ee36 00:27:35.784 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 7c628d48-600d-4ef2-bb0a-aeb51599ee36 ']' 00:27:35.784 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:36.042 [2024-07-15 09:31:44.751568] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:36.042 [2024-07-15 09:31:44.751590] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:36.042 [2024-07-15 09:31:44.751647] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:36.042 [2024-07-15 09:31:44.751701] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:36.042 [2024-07-15 09:31:44.751713] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x239fd20 name raid_bdev1, state offline 00:27:36.042 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.042 09:31:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:36.299 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:36.299 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:36.299 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:36.299 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:36.299 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:36.299 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:36.557 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:36.557 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:36.815 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:37.074 [2024-07-15 09:31:45.970742] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:37.074 [2024-07-15 09:31:45.972103] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:37.074 [2024-07-15 09:31:45.972159] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:37.074 [2024-07-15 09:31:45.972199] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:37.074 [2024-07-15 09:31:45.972217] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:37.074 [2024-07-15 09:31:45.972227] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x220fed0 name raid_bdev1, state configuring 00:27:37.074 request: 00:27:37.074 { 00:27:37.074 "name": "raid_bdev1", 00:27:37.074 "raid_level": "raid1", 00:27:37.074 "base_bdevs": [ 00:27:37.074 "malloc1", 00:27:37.074 "malloc2" 00:27:37.074 ], 00:27:37.074 "superblock": false, 00:27:37.074 "method": "bdev_raid_create", 00:27:37.074 "req_id": 1 00:27:37.074 } 00:27:37.074 Got JSON-RPC error response 00:27:37.074 response: 00:27:37.074 { 00:27:37.074 "code": -17, 00:27:37.074 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:37.074 } 00:27:37.074 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # 
es=1 00:27:37.074 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:37.074 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:37.074 09:31:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:37.074 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.074 09:31:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:37.332 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:37.332 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:37.332 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:37.591 [2024-07-15 09:31:46.459961] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:37.591 [2024-07-15 09:31:46.460003] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:37.591 [2024-07-15 09:31:46.460020] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23adee0 00:27:37.591 [2024-07-15 09:31:46.460033] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:37.591 [2024-07-15 09:31:46.461502] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:37.591 [2024-07-15 09:31:46.461529] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:37.591 [2024-07-15 09:31:46.461577] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:37.591 [2024-07-15 09:31:46.461602] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:37.591 pt1 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:27:37.591 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:37.850 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:37.850 "name": "raid_bdev1", 00:27:37.850 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:37.850 "strip_size_kb": 0, 00:27:37.850 "state": "configuring", 00:27:37.850 "raid_level": "raid1", 00:27:37.850 "superblock": true, 00:27:37.850 "num_base_bdevs": 2, 00:27:37.850 "num_base_bdevs_discovered": 1, 00:27:37.850 "num_base_bdevs_operational": 2, 00:27:37.850 "base_bdevs_list": [ 00:27:37.850 { 00:27:37.850 "name": "pt1", 00:27:37.850 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:37.850 "is_configured": true, 00:27:37.850 "data_offset": 256, 00:27:37.850 "data_size": 7936 00:27:37.850 }, 00:27:37.850 { 00:27:37.850 "name": null, 00:27:37.850 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:37.850 "is_configured": false, 00:27:37.850 "data_offset": 256, 00:27:37.850 "data_size": 7936 00:27:37.850 } 00:27:37.850 ] 00:27:37.850 }' 00:27:37.850 09:31:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:37.850 09:31:46 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:38.417 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:38.417 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:38.417 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:38.417 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:38.676 [2024-07-15 09:31:47.591128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:38.676 [2024-07-15 09:31:47.591176] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:38.676 [2024-07-15 09:31:47.591195] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2210490 00:27:38.676 [2024-07-15 09:31:47.591207] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:38.676 [2024-07-15 09:31:47.591390] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:38.676 [2024-07-15 09:31:47.591407] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:38.676 [2024-07-15 09:31:47.591450] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:38.676 [2024-07-15 09:31:47.591467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:38.676 [2024-07-15 09:31:47.591556] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23945d0 00:27:38.676 [2024-07-15 09:31:47.591567] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:38.676 [2024-07-15 09:31:47.591621] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2395800 00:27:38.676 [2024-07-15 09:31:47.591719] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23945d0 00:27:38.677 [2024-07-15 09:31:47.591729] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x23945d0 00:27:38.677 [2024-07-15 09:31:47.591796] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:38.677 pt2 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.677 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:38.935 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:38.935 "name": "raid_bdev1", 00:27:38.935 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:38.935 "strip_size_kb": 0, 00:27:38.935 "state": "online", 00:27:38.935 "raid_level": "raid1", 00:27:38.935 "superblock": true, 00:27:38.935 "num_base_bdevs": 2, 00:27:38.935 "num_base_bdevs_discovered": 2, 00:27:38.935 "num_base_bdevs_operational": 2, 00:27:38.935 "base_bdevs_list": [ 00:27:38.935 { 00:27:38.935 "name": "pt1", 00:27:38.935 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:38.936 "is_configured": true, 00:27:38.936 "data_offset": 256, 00:27:38.936 "data_size": 7936 00:27:38.936 }, 00:27:38.936 { 00:27:38.936 "name": "pt2", 00:27:38.936 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:38.936 "is_configured": true, 00:27:38.936 "data_offset": 256, 00:27:38.936 "data_size": 7936 00:27:38.936 } 00:27:38.936 ] 00:27:38.936 }' 00:27:38.936 09:31:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:38.936 09:31:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:39.872 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:39.872 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:39.872 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:39.872 09:31:48 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:39.872 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:39.872 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:39.872 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:39.872 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:39.872 [2024-07-15 09:31:48.682273] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:39.872 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:39.872 "name": "raid_bdev1", 00:27:39.872 "aliases": [ 00:27:39.872 "7c628d48-600d-4ef2-bb0a-aeb51599ee36" 00:27:39.872 ], 00:27:39.872 "product_name": "Raid Volume", 00:27:39.872 "block_size": 4096, 00:27:39.872 "num_blocks": 7936, 00:27:39.872 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:39.872 "md_size": 32, 00:27:39.872 "md_interleave": false, 00:27:39.872 "dif_type": 0, 00:27:39.872 "assigned_rate_limits": { 00:27:39.872 "rw_ios_per_sec": 0, 00:27:39.873 "rw_mbytes_per_sec": 0, 00:27:39.873 "r_mbytes_per_sec": 0, 00:27:39.873 "w_mbytes_per_sec": 0 00:27:39.873 }, 00:27:39.873 "claimed": false, 00:27:39.873 "zoned": false, 00:27:39.873 "supported_io_types": { 00:27:39.873 "read": true, 00:27:39.873 "write": true, 00:27:39.873 "unmap": false, 00:27:39.873 "flush": false, 00:27:39.873 "reset": true, 00:27:39.873 "nvme_admin": false, 00:27:39.873 "nvme_io": false, 00:27:39.873 "nvme_io_md": false, 00:27:39.873 "write_zeroes": true, 00:27:39.873 "zcopy": false, 00:27:39.873 "get_zone_info": false, 00:27:39.873 "zone_management": false, 00:27:39.873 "zone_append": false, 00:27:39.873 "compare": false, 00:27:39.873 "compare_and_write": false, 00:27:39.873 "abort": false, 00:27:39.873 "seek_hole": false, 00:27:39.873 "seek_data": false, 00:27:39.873 "copy": false, 00:27:39.873 "nvme_iov_md": false 00:27:39.873 }, 00:27:39.873 "memory_domains": [ 00:27:39.873 { 00:27:39.873 "dma_device_id": "system", 00:27:39.873 "dma_device_type": 1 00:27:39.873 }, 00:27:39.873 { 00:27:39.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:39.873 "dma_device_type": 2 00:27:39.873 }, 00:27:39.873 { 00:27:39.873 "dma_device_id": "system", 00:27:39.873 "dma_device_type": 1 00:27:39.873 }, 00:27:39.873 { 00:27:39.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:39.873 "dma_device_type": 2 00:27:39.873 } 00:27:39.873 ], 00:27:39.873 "driver_specific": { 00:27:39.873 "raid": { 00:27:39.873 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:39.873 "strip_size_kb": 0, 00:27:39.873 "state": "online", 00:27:39.873 "raid_level": "raid1", 00:27:39.873 "superblock": true, 00:27:39.873 "num_base_bdevs": 2, 00:27:39.873 "num_base_bdevs_discovered": 2, 00:27:39.873 "num_base_bdevs_operational": 2, 00:27:39.873 "base_bdevs_list": [ 00:27:39.873 { 00:27:39.873 "name": "pt1", 00:27:39.873 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:39.873 "is_configured": true, 00:27:39.873 "data_offset": 256, 00:27:39.873 "data_size": 7936 00:27:39.873 }, 00:27:39.873 { 00:27:39.873 "name": "pt2", 00:27:39.873 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:39.873 "is_configured": true, 00:27:39.873 "data_offset": 256, 00:27:39.873 "data_size": 7936 
00:27:39.873 } 00:27:39.873 ] 00:27:39.873 } 00:27:39.873 } 00:27:39.873 }' 00:27:39.873 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:39.873 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:39.873 pt2' 00:27:39.873 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:39.873 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:39.873 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:40.132 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:40.132 "name": "pt1", 00:27:40.132 "aliases": [ 00:27:40.132 "00000000-0000-0000-0000-000000000001" 00:27:40.132 ], 00:27:40.132 "product_name": "passthru", 00:27:40.132 "block_size": 4096, 00:27:40.132 "num_blocks": 8192, 00:27:40.132 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:40.132 "md_size": 32, 00:27:40.132 "md_interleave": false, 00:27:40.132 "dif_type": 0, 00:27:40.132 "assigned_rate_limits": { 00:27:40.132 "rw_ios_per_sec": 0, 00:27:40.132 "rw_mbytes_per_sec": 0, 00:27:40.132 "r_mbytes_per_sec": 0, 00:27:40.132 "w_mbytes_per_sec": 0 00:27:40.132 }, 00:27:40.132 "claimed": true, 00:27:40.132 "claim_type": "exclusive_write", 00:27:40.132 "zoned": false, 00:27:40.132 "supported_io_types": { 00:27:40.132 "read": true, 00:27:40.132 "write": true, 00:27:40.132 "unmap": true, 00:27:40.132 "flush": true, 00:27:40.132 "reset": true, 00:27:40.132 "nvme_admin": false, 00:27:40.132 "nvme_io": false, 00:27:40.132 "nvme_io_md": false, 00:27:40.132 "write_zeroes": true, 00:27:40.132 "zcopy": true, 00:27:40.132 "get_zone_info": false, 00:27:40.132 "zone_management": false, 00:27:40.132 "zone_append": false, 00:27:40.132 "compare": false, 00:27:40.132 "compare_and_write": false, 00:27:40.132 "abort": true, 00:27:40.132 "seek_hole": false, 00:27:40.132 "seek_data": false, 00:27:40.132 "copy": true, 00:27:40.132 "nvme_iov_md": false 00:27:40.132 }, 00:27:40.132 "memory_domains": [ 00:27:40.132 { 00:27:40.132 "dma_device_id": "system", 00:27:40.132 "dma_device_type": 1 00:27:40.132 }, 00:27:40.132 { 00:27:40.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:40.132 "dma_device_type": 2 00:27:40.132 } 00:27:40.132 ], 00:27:40.132 "driver_specific": { 00:27:40.132 "passthru": { 00:27:40.132 "name": "pt1", 00:27:40.132 "base_bdev_name": "malloc1" 00:27:40.132 } 00:27:40.132 } 00:27:40.132 }' 00:27:40.132 09:31:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:40.132 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:40.391 
09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:40.391 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:40.650 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:40.650 "name": "pt2", 00:27:40.650 "aliases": [ 00:27:40.650 "00000000-0000-0000-0000-000000000002" 00:27:40.650 ], 00:27:40.650 "product_name": "passthru", 00:27:40.650 "block_size": 4096, 00:27:40.650 "num_blocks": 8192, 00:27:40.650 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:40.650 "md_size": 32, 00:27:40.650 "md_interleave": false, 00:27:40.650 "dif_type": 0, 00:27:40.650 "assigned_rate_limits": { 00:27:40.650 "rw_ios_per_sec": 0, 00:27:40.650 "rw_mbytes_per_sec": 0, 00:27:40.650 "r_mbytes_per_sec": 0, 00:27:40.650 "w_mbytes_per_sec": 0 00:27:40.650 }, 00:27:40.650 "claimed": true, 00:27:40.650 "claim_type": "exclusive_write", 00:27:40.650 "zoned": false, 00:27:40.650 "supported_io_types": { 00:27:40.650 "read": true, 00:27:40.650 "write": true, 00:27:40.650 "unmap": true, 00:27:40.650 "flush": true, 00:27:40.650 "reset": true, 00:27:40.650 "nvme_admin": false, 00:27:40.650 "nvme_io": false, 00:27:40.650 "nvme_io_md": false, 00:27:40.650 "write_zeroes": true, 00:27:40.650 "zcopy": true, 00:27:40.650 "get_zone_info": false, 00:27:40.650 "zone_management": false, 00:27:40.650 "zone_append": false, 00:27:40.650 "compare": false, 00:27:40.650 "compare_and_write": false, 00:27:40.650 "abort": true, 00:27:40.650 "seek_hole": false, 00:27:40.650 "seek_data": false, 00:27:40.650 "copy": true, 00:27:40.650 "nvme_iov_md": false 00:27:40.650 }, 00:27:40.650 "memory_domains": [ 00:27:40.650 { 00:27:40.650 "dma_device_id": "system", 00:27:40.650 "dma_device_type": 1 00:27:40.650 }, 00:27:40.650 { 00:27:40.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:40.650 "dma_device_type": 2 00:27:40.650 } 00:27:40.650 ], 00:27:40.650 "driver_specific": { 00:27:40.650 "passthru": { 00:27:40.650 "name": "pt2", 00:27:40.650 "base_bdev_name": "malloc2" 00:27:40.650 } 00:27:40.650 } 00:27:40.650 }' 00:27:40.650 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:40.909 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:40.909 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:40.909 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:40.909 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:40.909 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- 
# [[ 32 == 32 ]] 00:27:40.909 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:40.909 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:40.909 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:40.909 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:41.168 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:41.168 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:41.168 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:41.168 09:31:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:41.427 [2024-07-15 09:31:50.162257] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:41.427 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 7c628d48-600d-4ef2-bb0a-aeb51599ee36 '!=' 7c628d48-600d-4ef2-bb0a-aeb51599ee36 ']' 00:27:41.427 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:41.427 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:41.427 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:41.427 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:41.686 [2024-07-15 09:31:50.414666] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.686 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.945 09:31:50 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:41.945 "name": "raid_bdev1", 00:27:41.945 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:41.946 "strip_size_kb": 0, 00:27:41.946 "state": "online", 00:27:41.946 "raid_level": "raid1", 00:27:41.946 "superblock": true, 00:27:41.946 "num_base_bdevs": 2, 00:27:41.946 "num_base_bdevs_discovered": 1, 00:27:41.946 "num_base_bdevs_operational": 1, 00:27:41.946 "base_bdevs_list": [ 00:27:41.946 { 00:27:41.946 "name": null, 00:27:41.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:41.946 "is_configured": false, 00:27:41.946 "data_offset": 256, 00:27:41.946 "data_size": 7936 00:27:41.946 }, 00:27:41.946 { 00:27:41.946 "name": "pt2", 00:27:41.946 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:41.946 "is_configured": true, 00:27:41.946 "data_offset": 256, 00:27:41.946 "data_size": 7936 00:27:41.946 } 00:27:41.946 ] 00:27:41.946 }' 00:27:41.946 09:31:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:41.946 09:31:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:42.514 09:31:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:42.773 [2024-07-15 09:31:51.525589] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:42.773 [2024-07-15 09:31:51.525615] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:42.773 [2024-07-15 09:31:51.525669] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:42.773 [2024-07-15 09:31:51.525711] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:42.773 [2024-07-15 09:31:51.525723] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23945d0 name raid_bdev1, state offline 00:27:42.773 09:31:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.773 09:31:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:43.031 09:31:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:43.031 09:31:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:43.031 09:31:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:43.031 09:31:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:43.031 09:31:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:43.290 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:43.290 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:43.290 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:43.290 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:43.290 09:31:52 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@518 -- # i=1 00:27:43.290 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:43.549 [2024-07-15 09:31:52.247465] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:43.549 [2024-07-15 09:31:52.247511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:43.549 [2024-07-15 09:31:52.247530] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2392660 00:27:43.549 [2024-07-15 09:31:52.247543] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:43.549 [2024-07-15 09:31:52.249037] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:43.549 [2024-07-15 09:31:52.249066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:43.549 [2024-07-15 09:31:52.249117] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:43.549 [2024-07-15 09:31:52.249143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:43.549 [2024-07-15 09:31:52.249222] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2394d10 00:27:43.549 [2024-07-15 09:31:52.249233] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:43.549 [2024-07-15 09:31:52.249289] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2395560 00:27:43.549 [2024-07-15 09:31:52.249385] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2394d10 00:27:43.549 [2024-07-15 09:31:52.249394] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2394d10 00:27:43.549 [2024-07-15 09:31:52.249460] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:43.549 pt2 00:27:43.549 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:43.549 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.550 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.808 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:43.808 "name": "raid_bdev1", 00:27:43.808 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:43.808 "strip_size_kb": 0, 00:27:43.808 "state": "online", 00:27:43.808 "raid_level": "raid1", 00:27:43.808 "superblock": true, 00:27:43.808 "num_base_bdevs": 2, 00:27:43.808 "num_base_bdevs_discovered": 1, 00:27:43.808 "num_base_bdevs_operational": 1, 00:27:43.808 "base_bdevs_list": [ 00:27:43.808 { 00:27:43.808 "name": null, 00:27:43.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.808 "is_configured": false, 00:27:43.808 "data_offset": 256, 00:27:43.808 "data_size": 7936 00:27:43.808 }, 00:27:43.808 { 00:27:43.808 "name": "pt2", 00:27:43.808 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:43.808 "is_configured": true, 00:27:43.808 "data_offset": 256, 00:27:43.808 "data_size": 7936 00:27:43.808 } 00:27:43.808 ] 00:27:43.808 }' 00:27:43.808 09:31:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:43.808 09:31:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:44.379 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:44.379 [2024-07-15 09:31:53.262348] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:44.379 [2024-07-15 09:31:53.262373] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:44.379 [2024-07-15 09:31:53.262425] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:44.379 [2024-07-15 09:31:53.262466] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:44.379 [2024-07-15 09:31:53.262478] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2394d10 name raid_bdev1, state offline 00:27:44.379 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.379 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:44.637 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:44.637 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:44.637 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:44.637 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:44.894 [2024-07-15 09:31:53.619274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:44.894 [2024-07-15 09:31:53.619319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:44.894 [2024-07-15 09:31:53.619336] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2393760 00:27:44.894 [2024-07-15 09:31:53.619349] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:44.894 [2024-07-15 09:31:53.620763] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:44.894 [2024-07-15 09:31:53.620790] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:44.894 [2024-07-15 09:31:53.620836] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:44.894 [2024-07-15 09:31:53.620860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:44.894 [2024-07-15 09:31:53.620959] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:44.894 [2024-07-15 09:31:53.620972] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:44.894 [2024-07-15 09:31:53.620986] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2395850 name raid_bdev1, state configuring 00:27:44.894 [2024-07-15 09:31:53.621009] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:44.894 [2024-07-15 09:31:53.621059] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2394850 00:27:44.894 [2024-07-15 09:31:53.621070] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:44.894 [2024-07-15 09:31:53.621131] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23953b0 00:27:44.894 [2024-07-15 09:31:53.621228] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2394850 00:27:44.894 [2024-07-15 09:31:53.621238] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2394850 00:27:44.894 [2024-07-15 09:31:53.621311] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:44.894 pt1 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:27:44.894 "name": "raid_bdev1", 00:27:44.894 "uuid": "7c628d48-600d-4ef2-bb0a-aeb51599ee36", 00:27:44.894 "strip_size_kb": 0, 00:27:44.894 "state": "online", 00:27:44.894 "raid_level": "raid1", 00:27:44.894 "superblock": true, 00:27:44.894 "num_base_bdevs": 2, 00:27:44.894 "num_base_bdevs_discovered": 1, 00:27:44.894 "num_base_bdevs_operational": 1, 00:27:44.894 "base_bdevs_list": [ 00:27:44.894 { 00:27:44.894 "name": null, 00:27:44.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:44.894 "is_configured": false, 00:27:44.894 "data_offset": 256, 00:27:44.894 "data_size": 7936 00:27:44.894 }, 00:27:44.894 { 00:27:44.894 "name": "pt2", 00:27:44.894 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:44.894 "is_configured": true, 00:27:44.894 "data_offset": 256, 00:27:44.894 "data_size": 7936 00:27:44.894 } 00:27:44.894 ] 00:27:44.894 }' 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:44.894 09:31:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:45.497 09:31:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:45.497 09:31:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:45.756 09:31:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:45.756 09:31:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:45.756 09:31:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:46.015 [2024-07-15 09:31:54.906941] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:46.015 09:31:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 7c628d48-600d-4ef2-bb0a-aeb51599ee36 '!=' 7c628d48-600d-4ef2-bb0a-aeb51599ee36 ']' 00:27:46.015 09:31:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 233204 00:27:46.015 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 233204 ']' 00:27:46.015 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 233204 00:27:46.015 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:46.015 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:46.015 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 233204 00:27:46.275 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:46.275 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:46.275 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 233204' 00:27:46.275 killing process with pid 233204 00:27:46.275 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 233204 00:27:46.275 [2024-07-15 09:31:54.977688] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:46.275 [2024-07-15 09:31:54.977738] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:46.275 [2024-07-15 09:31:54.977782] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:46.275 [2024-07-15 09:31:54.977794] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2394850 name raid_bdev1, state offline 00:27:46.275 09:31:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 233204 00:27:46.275 [2024-07-15 09:31:55.003189] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:46.275 09:31:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:27:46.275 00:27:46.275 real 0m15.509s 00:27:46.275 user 0m28.065s 00:27:46.275 sys 0m2.850s 00:27:46.275 09:31:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:46.275 09:31:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:46.275 ************************************ 00:27:46.275 END TEST raid_superblock_test_md_separate 00:27:46.275 ************************************ 00:27:46.534 09:31:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:46.534 09:31:55 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:27:46.534 09:31:55 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:27:46.534 09:31:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:46.534 09:31:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:46.534 09:31:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:46.534 ************************************ 00:27:46.534 START TEST raid_rebuild_test_sb_md_separate 00:27:46.534 ************************************ 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:46.534 09:31:55 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=235545 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 235545 /var/tmp/spdk-raid.sock 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 235545 ']' 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:46.534 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:46.534 09:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:46.534 [2024-07-15 09:31:55.371995] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:27:46.534 [2024-07-15 09:31:55.372064] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid235545 ] 00:27:46.534 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:46.534 Zero copy mechanism will not be used. 
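For orientation while reading the trace that follows: once bdevperf is listening on /var/tmp/spdk-raid.sock, the rebuild test drives it purely through rpc.py. The sketch below condenses that RPC sequence as it appears later in this log; it assumes the socket is already up, and the `rpc` shell variable is a local shorthand introduced here for readability, not a name from the test scripts.

# Condensed sketch of the base-bdev construction traced below (assumes bdevperf
# is already serving RPCs on /var/tmp/spdk-raid.sock).
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Two 32 MB malloc bdevs with 4 KiB blocks and separate per-block metadata
# (-m 32), each wrapped in a passthru bdev the raid module can claim.
$rpc bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc
$rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
$rpc bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc
$rpc bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
# The spare sits behind a delay bdev (100 ms writes) so the later rebuild
# runs slowly enough to be observed in-flight.
$rpc bdev_malloc_create 32 4096 -m 32 -b spare_malloc
$rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$rpc bdev_passthru_create -b spare_delay -p spare
# RAID1 with an on-disk superblock (-s) over the two passthru base bdevs.
$rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1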
00:27:46.793 [2024-07-15 09:31:55.499447] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:46.793 [2024-07-15 09:31:55.609480] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:46.793 [2024-07-15 09:31:55.674844] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:46.793 [2024-07-15 09:31:55.674893] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:47.361 09:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:47.361 09:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:47.361 09:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:47.361 09:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:27:47.619 BaseBdev1_malloc 00:27:47.619 09:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:47.877 [2024-07-15 09:31:56.604982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:47.877 [2024-07-15 09:31:56.605027] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:47.877 [2024-07-15 09:31:56.605055] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16416d0 00:27:47.877 [2024-07-15 09:31:56.605069] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:47.877 [2024-07-15 09:31:56.606461] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:47.877 [2024-07-15 09:31:56.606489] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:47.877 BaseBdev1 00:27:47.877 09:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:47.877 09:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:27:48.136 BaseBdev2_malloc 00:27:48.136 09:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:48.403 [2024-07-15 09:31:57.109166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:48.403 [2024-07-15 09:31:57.109211] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:48.403 [2024-07-15 09:31:57.109238] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17991f0 00:27:48.403 [2024-07-15 09:31:57.109250] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:48.403 [2024-07-15 09:31:57.110664] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:48.403 [2024-07-15 09:31:57.110689] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:48.403 BaseBdev2 00:27:48.403 09:31:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:27:48.662 spare_malloc 00:27:48.662 09:31:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:48.662 spare_delay 00:27:48.921 09:31:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:48.921 [2024-07-15 09:31:57.833711] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:48.921 [2024-07-15 09:31:57.833754] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:48.921 [2024-07-15 09:31:57.833779] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17957a0 00:27:48.921 [2024-07-15 09:31:57.833791] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:48.921 [2024-07-15 09:31:57.835167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:48.921 [2024-07-15 09:31:57.835194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:48.921 spare 00:27:48.921 09:31:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:49.180 [2024-07-15 09:31:58.078380] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:49.180 [2024-07-15 09:31:58.079688] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:49.180 [2024-07-15 09:31:58.079858] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17961c0 00:27:49.180 [2024-07-15 09:31:58.079871] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:49.180 [2024-07-15 09:31:58.079960] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16a7360 00:27:49.180 [2024-07-15 09:31:58.080074] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17961c0 00:27:49.180 [2024-07-15 09:31:58.080084] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17961c0 00:27:49.180 [2024-07-15 09:31:58.080154] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:49.180 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:49.180 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:49.180 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:49.181 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:49.181 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:49.181 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:49.181 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:27:49.181 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:49.181 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:49.181 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:49.181 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.181 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.440 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:49.440 "name": "raid_bdev1", 00:27:49.440 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:27:49.440 "strip_size_kb": 0, 00:27:49.440 "state": "online", 00:27:49.440 "raid_level": "raid1", 00:27:49.440 "superblock": true, 00:27:49.440 "num_base_bdevs": 2, 00:27:49.440 "num_base_bdevs_discovered": 2, 00:27:49.440 "num_base_bdevs_operational": 2, 00:27:49.440 "base_bdevs_list": [ 00:27:49.440 { 00:27:49.440 "name": "BaseBdev1", 00:27:49.440 "uuid": "536f2556-36fe-56e5-a3be-0436306df943", 00:27:49.440 "is_configured": true, 00:27:49.440 "data_offset": 256, 00:27:49.440 "data_size": 7936 00:27:49.440 }, 00:27:49.440 { 00:27:49.440 "name": "BaseBdev2", 00:27:49.440 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:27:49.440 "is_configured": true, 00:27:49.440 "data_offset": 256, 00:27:49.440 "data_size": 7936 00:27:49.440 } 00:27:49.440 ] 00:27:49.440 }' 00:27:49.440 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:49.440 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:50.006 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:50.006 09:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:50.265 [2024-07-15 09:31:59.141563] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:50.265 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:50.265 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.265 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:50.524 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:50.781 [2024-07-15 09:31:59.634668] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16a7360 00:27:50.782 /dev/nbd0 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:50.782 1+0 records in 00:27:50.782 1+0 records out 00:27:50.782 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247316 s, 16.6 MB/s 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:50.782 09:31:59 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:50.782 09:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:51.717 7936+0 records in 00:27:51.717 7936+0 records out 00:27:51.717 32505856 bytes (33 MB, 31 MiB) copied, 0.757684 s, 42.9 MB/s 00:27:51.717 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:51.717 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:51.717 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:51.717 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:51.717 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:51.717 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:51.717 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:51.976 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:51.976 [2024-07-15 09:32:00.729134] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:51.976 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:51.976 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:51.976 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:51.976 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:51.976 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:51.976 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:51.976 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:51.976 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:52.233 [2024-07-15 09:32:00.961797] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:52.233 09:32:00 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.233 09:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:52.233 09:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.233 "name": "raid_bdev1", 00:27:52.233 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:27:52.233 "strip_size_kb": 0, 00:27:52.233 "state": "online", 00:27:52.233 "raid_level": "raid1", 00:27:52.233 "superblock": true, 00:27:52.233 "num_base_bdevs": 2, 00:27:52.233 "num_base_bdevs_discovered": 1, 00:27:52.233 "num_base_bdevs_operational": 1, 00:27:52.233 "base_bdevs_list": [ 00:27:52.233 { 00:27:52.233 "name": null, 00:27:52.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:52.233 "is_configured": false, 00:27:52.233 "data_offset": 256, 00:27:52.233 "data_size": 7936 00:27:52.233 }, 00:27:52.233 { 00:27:52.233 "name": "BaseBdev2", 00:27:52.233 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:27:52.233 "is_configured": true, 00:27:52.233 "data_offset": 256, 00:27:52.233 "data_size": 7936 00:27:52.233 } 00:27:52.233 ] 00:27:52.233 }' 00:27:52.233 09:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.233 09:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:53.167 09:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:53.167 [2024-07-15 09:32:01.980507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:53.167 [2024-07-15 09:32:01.982822] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1640350 00:27:53.167 [2024-07-15 09:32:01.985131] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:53.167 09:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:54.103 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:54.103 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:54.103 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:54.103 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:54.103 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:54.103 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.103 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.362 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:54.362 "name": "raid_bdev1", 00:27:54.362 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:27:54.362 "strip_size_kb": 0, 00:27:54.362 "state": "online", 00:27:54.362 "raid_level": "raid1", 00:27:54.362 "superblock": true, 00:27:54.362 "num_base_bdevs": 2, 00:27:54.362 "num_base_bdevs_discovered": 2, 00:27:54.362 "num_base_bdevs_operational": 2, 00:27:54.362 "process": { 00:27:54.362 "type": "rebuild", 00:27:54.362 "target": "spare", 00:27:54.362 "progress": { 00:27:54.362 "blocks": 3072, 00:27:54.362 "percent": 38 00:27:54.362 } 00:27:54.362 }, 00:27:54.362 "base_bdevs_list": [ 00:27:54.362 { 00:27:54.362 "name": "spare", 00:27:54.362 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:27:54.362 "is_configured": true, 00:27:54.362 "data_offset": 256, 00:27:54.362 "data_size": 7936 00:27:54.362 }, 00:27:54.362 { 00:27:54.362 "name": "BaseBdev2", 00:27:54.362 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:27:54.362 "is_configured": true, 00:27:54.362 "data_offset": 256, 00:27:54.362 "data_size": 7936 00:27:54.362 } 00:27:54.362 ] 00:27:54.362 }' 00:27:54.362 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:54.362 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:54.362 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:54.620 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:54.620 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:54.879 [2024-07-15 09:32:03.578007] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:54.879 [2024-07-15 09:32:03.598061] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:54.879 [2024-07-15 09:32:03.598110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:54.879 [2024-07-15 09:32:03.598125] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:54.879 [2024-07-15 09:32:03.598133] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
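The degraded-state assertion traced around this point reduces to the check sketched below. It is a minimal stand-in for verify_raid_bdev_state, assuming the same RPC socket; `rpc` and `info` are shorthand variables introduced here, and the real helper in bdev_raid.sh additionally compares raid_level, strip_size and every entry of base_bdevs_list.

# Minimal degraded-state check: raid_bdev1 must stay online with exactly one
# base bdev discovered after the spare was removed mid-rebuild.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[ "$(echo "$info" | jq -r '.state')" = "online" ] || exit 1
[ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq 1 ] || exit 1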
00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.879 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:55.137 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:55.137 "name": "raid_bdev1", 00:27:55.137 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:27:55.137 "strip_size_kb": 0, 00:27:55.137 "state": "online", 00:27:55.137 "raid_level": "raid1", 00:27:55.137 "superblock": true, 00:27:55.137 "num_base_bdevs": 2, 00:27:55.137 "num_base_bdevs_discovered": 1, 00:27:55.137 "num_base_bdevs_operational": 1, 00:27:55.137 "base_bdevs_list": [ 00:27:55.137 { 00:27:55.137 "name": null, 00:27:55.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:55.137 "is_configured": false, 00:27:55.137 "data_offset": 256, 00:27:55.137 "data_size": 7936 00:27:55.137 }, 00:27:55.137 { 00:27:55.137 "name": "BaseBdev2", 00:27:55.137 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:27:55.137 "is_configured": true, 00:27:55.137 "data_offset": 256, 00:27:55.137 "data_size": 7936 00:27:55.137 } 00:27:55.137 ] 00:27:55.137 }' 00:27:55.137 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:55.137 09:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:55.704 "name": "raid_bdev1", 00:27:55.704 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:27:55.704 "strip_size_kb": 0, 00:27:55.704 "state": "online", 00:27:55.704 "raid_level": "raid1", 00:27:55.704 "superblock": true, 00:27:55.704 "num_base_bdevs": 2, 00:27:55.704 "num_base_bdevs_discovered": 1, 00:27:55.704 "num_base_bdevs_operational": 1, 00:27:55.704 "base_bdevs_list": [ 00:27:55.704 { 00:27:55.704 "name": null, 00:27:55.704 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:27:55.704 "is_configured": false, 00:27:55.704 "data_offset": 256, 00:27:55.704 "data_size": 7936 00:27:55.704 }, 00:27:55.704 { 00:27:55.704 "name": "BaseBdev2", 00:27:55.704 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:27:55.704 "is_configured": true, 00:27:55.704 "data_offset": 256, 00:27:55.704 "data_size": 7936 00:27:55.704 } 00:27:55.704 ] 00:27:55.704 }' 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:55.704 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:55.961 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:55.961 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:56.220 [2024-07-15 09:32:04.917691] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:56.220 [2024-07-15 09:32:04.920031] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1641280 00:27:56.220 [2024-07-15 09:32:04.921586] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:56.220 09:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:57.153 09:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:57.153 09:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:57.153 09:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:57.153 09:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:57.153 09:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:57.153 09:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.153 09:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:57.413 "name": "raid_bdev1", 00:27:57.413 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:27:57.413 "strip_size_kb": 0, 00:27:57.413 "state": "online", 00:27:57.413 "raid_level": "raid1", 00:27:57.413 "superblock": true, 00:27:57.413 "num_base_bdevs": 2, 00:27:57.413 "num_base_bdevs_discovered": 2, 00:27:57.413 "num_base_bdevs_operational": 2, 00:27:57.413 "process": { 00:27:57.413 "type": "rebuild", 00:27:57.413 "target": "spare", 00:27:57.413 "progress": { 00:27:57.413 "blocks": 3072, 00:27:57.413 "percent": 38 00:27:57.413 } 00:27:57.413 }, 00:27:57.413 "base_bdevs_list": [ 00:27:57.413 { 00:27:57.413 "name": "spare", 00:27:57.413 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:27:57.413 "is_configured": true, 00:27:57.413 "data_offset": 256, 00:27:57.413 "data_size": 7936 00:27:57.413 }, 00:27:57.413 { 00:27:57.413 "name": 
"BaseBdev2", 00:27:57.413 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:27:57.413 "is_configured": true, 00:27:57.413 "data_offset": 256, 00:27:57.413 "data_size": 7936 00:27:57.413 } 00:27:57.413 ] 00:27:57.413 }' 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:57.413 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1070 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.413 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.671 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:57.671 "name": "raid_bdev1", 00:27:57.671 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:27:57.671 "strip_size_kb": 0, 00:27:57.671 "state": "online", 00:27:57.671 "raid_level": "raid1", 00:27:57.671 "superblock": true, 00:27:57.671 "num_base_bdevs": 2, 00:27:57.671 "num_base_bdevs_discovered": 2, 00:27:57.671 "num_base_bdevs_operational": 2, 00:27:57.671 "process": { 00:27:57.671 "type": "rebuild", 00:27:57.671 "target": "spare", 00:27:57.671 "progress": { 00:27:57.671 "blocks": 3840, 00:27:57.671 "percent": 48 00:27:57.671 } 00:27:57.671 }, 00:27:57.671 "base_bdevs_list": [ 00:27:57.671 { 00:27:57.671 "name": "spare", 00:27:57.671 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:27:57.671 "is_configured": true, 00:27:57.671 "data_offset": 256, 00:27:57.671 "data_size": 7936 
00:27:57.671 }, 00:27:57.671 { 00:27:57.671 "name": "BaseBdev2", 00:27:57.671 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:27:57.671 "is_configured": true, 00:27:57.671 "data_offset": 256, 00:27:57.671 "data_size": 7936 00:27:57.671 } 00:27:57.671 ] 00:27:57.671 }' 00:27:57.672 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:57.672 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:57.672 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:57.672 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:57.672 09:32:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:59.070 "name": "raid_bdev1", 00:27:59.070 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:27:59.070 "strip_size_kb": 0, 00:27:59.070 "state": "online", 00:27:59.070 "raid_level": "raid1", 00:27:59.070 "superblock": true, 00:27:59.070 "num_base_bdevs": 2, 00:27:59.070 "num_base_bdevs_discovered": 2, 00:27:59.070 "num_base_bdevs_operational": 2, 00:27:59.070 "process": { 00:27:59.070 "type": "rebuild", 00:27:59.070 "target": "spare", 00:27:59.070 "progress": { 00:27:59.070 "blocks": 7168, 00:27:59.070 "percent": 90 00:27:59.070 } 00:27:59.070 }, 00:27:59.070 "base_bdevs_list": [ 00:27:59.070 { 00:27:59.070 "name": "spare", 00:27:59.070 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:27:59.070 "is_configured": true, 00:27:59.070 "data_offset": 256, 00:27:59.070 "data_size": 7936 00:27:59.070 }, 00:27:59.070 { 00:27:59.070 "name": "BaseBdev2", 00:27:59.070 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:27:59.070 "is_configured": true, 00:27:59.070 "data_offset": 256, 00:27:59.070 "data_size": 7936 00:27:59.070 } 00:27:59.070 ] 00:27:59.070 }' 00:27:59.070 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:59.071 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:59.071 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:59.071 
09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:59.071 09:32:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:59.328 [2024-07-15 09:32:08.046054] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:59.328 [2024-07-15 09:32:08.046111] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:59.328 [2024-07-15 09:32:08.046192] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:00.319 09:32:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:00.319 09:32:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:00.319 09:32:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.319 09:32:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:00.319 09:32:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:00.319 09:32:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.319 09:32:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.319 09:32:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.319 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:00.319 "name": "raid_bdev1", 00:28:00.319 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:00.319 "strip_size_kb": 0, 00:28:00.319 "state": "online", 00:28:00.319 "raid_level": "raid1", 00:28:00.319 "superblock": true, 00:28:00.319 "num_base_bdevs": 2, 00:28:00.319 "num_base_bdevs_discovered": 2, 00:28:00.319 "num_base_bdevs_operational": 2, 00:28:00.319 "base_bdevs_list": [ 00:28:00.319 { 00:28:00.319 "name": "spare", 00:28:00.319 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:28:00.319 "is_configured": true, 00:28:00.319 "data_offset": 256, 00:28:00.319 "data_size": 7936 00:28:00.319 }, 00:28:00.319 { 00:28:00.319 "name": "BaseBdev2", 00:28:00.319 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:00.319 "is_configured": true, 00:28:00.319 "data_offset": 256, 00:28:00.319 "data_size": 7936 00:28:00.319 } 00:28:00.319 ] 00:28:00.319 }' 00:28:00.319 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.593 09:32:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.593 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:00.593 "name": "raid_bdev1", 00:28:00.593 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:00.593 "strip_size_kb": 0, 00:28:00.593 "state": "online", 00:28:00.593 "raid_level": "raid1", 00:28:00.593 "superblock": true, 00:28:00.593 "num_base_bdevs": 2, 00:28:00.593 "num_base_bdevs_discovered": 2, 00:28:00.593 "num_base_bdevs_operational": 2, 00:28:00.593 "base_bdevs_list": [ 00:28:00.593 { 00:28:00.593 "name": "spare", 00:28:00.593 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:28:00.593 "is_configured": true, 00:28:00.593 "data_offset": 256, 00:28:00.593 "data_size": 7936 00:28:00.593 }, 00:28:00.593 { 00:28:00.593 "name": "BaseBdev2", 00:28:00.593 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:00.593 "is_configured": true, 00:28:00.593 "data_offset": 256, 00:28:00.593 "data_size": 7936 00:28:00.593 } 00:28:00.593 ] 00:28:00.593 }' 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:28:00.852 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:01.110 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:01.110 "name": "raid_bdev1", 00:28:01.110 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:01.110 "strip_size_kb": 0, 00:28:01.110 "state": "online", 00:28:01.110 "raid_level": "raid1", 00:28:01.110 "superblock": true, 00:28:01.110 "num_base_bdevs": 2, 00:28:01.110 "num_base_bdevs_discovered": 2, 00:28:01.110 "num_base_bdevs_operational": 2, 00:28:01.110 "base_bdevs_list": [ 00:28:01.110 { 00:28:01.110 "name": "spare", 00:28:01.110 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:28:01.110 "is_configured": true, 00:28:01.110 "data_offset": 256, 00:28:01.110 "data_size": 7936 00:28:01.110 }, 00:28:01.110 { 00:28:01.110 "name": "BaseBdev2", 00:28:01.110 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:01.110 "is_configured": true, 00:28:01.110 "data_offset": 256, 00:28:01.110 "data_size": 7936 00:28:01.110 } 00:28:01.110 ] 00:28:01.110 }' 00:28:01.110 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:01.110 09:32:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:01.682 09:32:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:01.941 [2024-07-15 09:32:10.760936] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:01.941 [2024-07-15 09:32:10.760966] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:01.941 [2024-07-15 09:32:10.761024] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:01.941 [2024-07-15 09:32:10.761079] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:01.941 [2024-07-15 09:32:10.761092] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17961c0 name raid_bdev1, state offline 00:28:01.941 09:32:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:28:01.941 09:32:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:02.200 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:02.459 /dev/nbd0 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:02.459 1+0 records in 00:28:02.459 1+0 records out 00:28:02.459 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224939 s, 18.2 MB/s 00:28:02.459 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.460 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:02.460 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.460 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:02.460 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:02.460 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:02.460 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:02.460 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:02.719 /dev/nbd1 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:02.719 1+0 records in 00:28:02.719 1+0 records out 00:28:02.719 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00033729 s, 12.1 MB/s 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:02.719 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:02.720 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:02.720 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:02.720 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:02.720 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:02.720 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:02.720 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:02.720 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:02.720 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:02.720 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd0 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:02.979 09:32:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:03.239 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:03.499 [2024-07-15 09:32:12.403798] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:03.499 [2024-07-15 09:32:12.403844] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:03.499 [2024-07-15 09:32:12.403867] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1795f60 00:28:03.499 [2024-07-15 09:32:12.403880] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:03.499 [2024-07-15 09:32:12.405350] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:03.499 [2024-07-15 09:32:12.405377] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:03.499 [2024-07-15 09:32:12.405435] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:03.499 [2024-07-15 09:32:12.405461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:03.499 [2024-07-15 09:32:12.405556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:28:03.499 spare 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.499 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.758 [2024-07-15 09:32:12.505863] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16a77c0 00:28:03.758 [2024-07-15 09:32:12.505882] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:03.758 [2024-07-15 09:32:12.505964] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16a7480 00:28:03.758 [2024-07-15 09:32:12.506093] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16a77c0 00:28:03.758 [2024-07-15 09:32:12.506103] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16a77c0 00:28:03.758 [2024-07-15 09:32:12.506181] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:03.758 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:03.758 "name": "raid_bdev1", 00:28:03.758 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:03.758 "strip_size_kb": 0, 00:28:03.758 "state": "online", 00:28:03.758 "raid_level": "raid1", 00:28:03.758 "superblock": true, 00:28:03.758 "num_base_bdevs": 2, 00:28:03.758 "num_base_bdevs_discovered": 2, 00:28:03.758 "num_base_bdevs_operational": 2, 00:28:03.758 "base_bdevs_list": [ 00:28:03.758 { 00:28:03.758 "name": "spare", 00:28:03.758 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:28:03.758 "is_configured": true, 00:28:03.758 "data_offset": 256, 00:28:03.758 "data_size": 7936 00:28:03.758 }, 00:28:03.758 { 00:28:03.758 "name": "BaseBdev2", 00:28:03.758 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:03.758 "is_configured": true, 00:28:03.758 "data_offset": 256, 00:28:03.758 "data_size": 7936 00:28:03.758 } 00:28:03.758 ] 00:28:03.758 }' 00:28:03.758 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.758 09:32:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 
-- # set +x 00:28:04.325 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:04.325 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:04.325 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:04.325 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:04.325 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:04.325 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.325 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:04.584 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:04.584 "name": "raid_bdev1", 00:28:04.584 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:04.584 "strip_size_kb": 0, 00:28:04.584 "state": "online", 00:28:04.584 "raid_level": "raid1", 00:28:04.584 "superblock": true, 00:28:04.584 "num_base_bdevs": 2, 00:28:04.584 "num_base_bdevs_discovered": 2, 00:28:04.584 "num_base_bdevs_operational": 2, 00:28:04.584 "base_bdevs_list": [ 00:28:04.584 { 00:28:04.584 "name": "spare", 00:28:04.584 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:28:04.584 "is_configured": true, 00:28:04.584 "data_offset": 256, 00:28:04.584 "data_size": 7936 00:28:04.584 }, 00:28:04.584 { 00:28:04.584 "name": "BaseBdev2", 00:28:04.584 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:04.584 "is_configured": true, 00:28:04.584 "data_offset": 256, 00:28:04.584 "data_size": 7936 00:28:04.584 } 00:28:04.584 ] 00:28:04.584 }' 00:28:04.584 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:04.585 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:04.585 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:04.844 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:04.844 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.844 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:05.104 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:05.104 09:32:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:05.104 [2024-07-15 09:32:14.048296] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:05.363 09:32:14 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:05.363 "name": "raid_bdev1", 00:28:05.363 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:05.363 "strip_size_kb": 0, 00:28:05.363 "state": "online", 00:28:05.363 "raid_level": "raid1", 00:28:05.363 "superblock": true, 00:28:05.363 "num_base_bdevs": 2, 00:28:05.363 "num_base_bdevs_discovered": 1, 00:28:05.363 "num_base_bdevs_operational": 1, 00:28:05.363 "base_bdevs_list": [ 00:28:05.363 { 00:28:05.363 "name": null, 00:28:05.363 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:05.363 "is_configured": false, 00:28:05.363 "data_offset": 256, 00:28:05.363 "data_size": 7936 00:28:05.363 }, 00:28:05.363 { 00:28:05.363 "name": "BaseBdev2", 00:28:05.363 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:05.363 "is_configured": true, 00:28:05.363 "data_offset": 256, 00:28:05.363 "data_size": 7936 00:28:05.363 } 00:28:05.363 ] 00:28:05.363 }' 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:05.363 09:32:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:06.300 09:32:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:06.560 [2024-07-15 09:32:15.375823] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:06.560 [2024-07-15 09:32:15.375988] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:06.560 [2024-07-15 09:32:15.376006] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
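The step traced above is the first remove/re-add cycle: bdev_raid_remove_base_bdev drops "spare" out of raid_bdev1 (num_base_bdevs_discovered falls to 1 while the array stays online), and bdev_raid_add_base_bdev hands it back, at which point examine sees the older superblock (seq 4 < 5) and starts a rebuild. A minimal by-hand sketch of that cycle, reusing the socket and bdev names from this run; the rpc/sock shell variables and the exact jq field picks are shorthand added here, not taken from the test script:
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# remove the base bdev; raid_bdev1 should stay online with one operational member
$rpc -s $sock bdev_raid_remove_base_bdev spare
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_discovered'   # expect 1
# give it back; examine finds the stale superblock and re-adds it, kicking off a rebuild
$rpc -s $sock bdev_raid_add_base_bdev raid_bdev1 spare
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'      # expect "rebuild" while it runs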
00:28:06.560 [2024-07-15 09:32:15.376033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:06.560 [2024-07-15 09:32:15.378495] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1641280 00:28:06.560 [2024-07-15 09:32:15.379884] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:06.560 09:32:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:07.498 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:07.498 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:07.498 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:07.498 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:07.498 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:07.498 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.498 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.757 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:07.757 "name": "raid_bdev1", 00:28:07.757 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:07.757 "strip_size_kb": 0, 00:28:07.757 "state": "online", 00:28:07.757 "raid_level": "raid1", 00:28:07.757 "superblock": true, 00:28:07.757 "num_base_bdevs": 2, 00:28:07.757 "num_base_bdevs_discovered": 2, 00:28:07.757 "num_base_bdevs_operational": 2, 00:28:07.757 "process": { 00:28:07.757 "type": "rebuild", 00:28:07.757 "target": "spare", 00:28:07.757 "progress": { 00:28:07.757 "blocks": 3072, 00:28:07.757 "percent": 38 00:28:07.757 } 00:28:07.757 }, 00:28:07.757 "base_bdevs_list": [ 00:28:07.757 { 00:28:07.757 "name": "spare", 00:28:07.757 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:28:07.757 "is_configured": true, 00:28:07.757 "data_offset": 256, 00:28:07.757 "data_size": 7936 00:28:07.757 }, 00:28:07.757 { 00:28:07.757 "name": "BaseBdev2", 00:28:07.757 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:07.757 "is_configured": true, 00:28:07.757 "data_offset": 256, 00:28:07.757 "data_size": 7936 00:28:07.757 } 00:28:07.757 ] 00:28:07.757 }' 00:28:07.757 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:07.757 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:07.757 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:08.015 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:08.015 09:32:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:08.015 [2024-07-15 09:32:16.964961] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:08.274 [2024-07-15 09:32:16.992733] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:08.274 [2024-07-15 09:32:16.992778] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:08.274 [2024-07-15 09:32:16.992794] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:08.274 [2024-07-15 09:32:16.992803] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.274 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:08.534 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:08.534 "name": "raid_bdev1", 00:28:08.534 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:08.534 "strip_size_kb": 0, 00:28:08.534 "state": "online", 00:28:08.534 "raid_level": "raid1", 00:28:08.534 "superblock": true, 00:28:08.534 "num_base_bdevs": 2, 00:28:08.534 "num_base_bdevs_discovered": 1, 00:28:08.534 "num_base_bdevs_operational": 1, 00:28:08.534 "base_bdevs_list": [ 00:28:08.534 { 00:28:08.534 "name": null, 00:28:08.534 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:08.534 "is_configured": false, 00:28:08.534 "data_offset": 256, 00:28:08.534 "data_size": 7936 00:28:08.534 }, 00:28:08.534 { 00:28:08.534 "name": "BaseBdev2", 00:28:08.534 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:08.534 "is_configured": true, 00:28:08.534 "data_offset": 256, 00:28:08.534 "data_size": 7936 00:28:08.534 } 00:28:08.534 ] 00:28:08.534 }' 00:28:08.534 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:08.534 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:09.103 09:32:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:09.103 [2024-07-15 09:32:17.990528] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
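Here the failure is injected one layer down: instead of calling bdev_raid_remove_base_bdev directly, the script deletes the passthru vbdev "spare" (built on "spare_delay"), which yanks the member out from under raid_bdev1 mid-rebuild, then recreates it so the examine path finds the stale superblock and re-adds it for another rebuild pass. A minimal sketch of that delete/create pair, with the same rpc/sock shorthand as above:
$rpc -s $sock bdev_passthru_delete spare                     # member disappears; the in-progress rebuild aborts ("No such device" warnings)
$rpc -s $sock bdev_passthru_create -b spare_delay -p spare   # passthru comes back, examine re-adds "spare" and a new rebuild starts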
00:28:09.103 [2024-07-15 09:32:17.990578] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:09.103 [2024-07-15 09:32:17.990605] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16a6ca0 00:28:09.103 [2024-07-15 09:32:17.990618] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:09.103 [2024-07-15 09:32:17.990833] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:09.103 [2024-07-15 09:32:17.990849] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:09.103 [2024-07-15 09:32:17.990907] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:09.103 [2024-07-15 09:32:17.990918] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:09.103 [2024-07-15 09:32:17.990938] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:09.103 [2024-07-15 09:32:17.990956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:09.103 [2024-07-15 09:32:17.993153] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1641280 00:28:09.103 [2024-07-15 09:32:17.994498] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:09.103 spare 00:28:09.103 09:32:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:10.484 "name": "raid_bdev1", 00:28:10.484 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:10.484 "strip_size_kb": 0, 00:28:10.484 "state": "online", 00:28:10.484 "raid_level": "raid1", 00:28:10.484 "superblock": true, 00:28:10.484 "num_base_bdevs": 2, 00:28:10.484 "num_base_bdevs_discovered": 2, 00:28:10.484 "num_base_bdevs_operational": 2, 00:28:10.484 "process": { 00:28:10.484 "type": "rebuild", 00:28:10.484 "target": "spare", 00:28:10.484 "progress": { 00:28:10.484 "blocks": 3072, 00:28:10.484 "percent": 38 00:28:10.484 } 00:28:10.484 }, 00:28:10.484 "base_bdevs_list": [ 00:28:10.484 { 00:28:10.484 "name": "spare", 00:28:10.484 "uuid": "c820db16-0d6f-5aad-9df7-78fd0fbbc975", 00:28:10.484 "is_configured": true, 00:28:10.484 "data_offset": 256, 00:28:10.484 "data_size": 7936 00:28:10.484 }, 00:28:10.484 { 00:28:10.484 "name": "BaseBdev2", 00:28:10.484 "uuid": 
"fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:10.484 "is_configured": true, 00:28:10.484 "data_offset": 256, 00:28:10.484 "data_size": 7936 00:28:10.484 } 00:28:10.484 ] 00:28:10.484 }' 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:10.484 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:11.053 [2024-07-15 09:32:19.821144] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:11.053 [2024-07-15 09:32:19.909340] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:11.053 [2024-07-15 09:32:19.909386] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:11.053 [2024-07-15 09:32:19.909402] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:11.053 [2024-07-15 09:32:19.909411] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.053 09:32:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.313 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.313 "name": "raid_bdev1", 00:28:11.313 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:11.313 "strip_size_kb": 0, 00:28:11.313 "state": "online", 00:28:11.313 "raid_level": "raid1", 00:28:11.313 "superblock": true, 00:28:11.313 "num_base_bdevs": 2, 00:28:11.313 "num_base_bdevs_discovered": 1, 00:28:11.313 
"num_base_bdevs_operational": 1, 00:28:11.313 "base_bdevs_list": [ 00:28:11.313 { 00:28:11.313 "name": null, 00:28:11.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:11.313 "is_configured": false, 00:28:11.313 "data_offset": 256, 00:28:11.313 "data_size": 7936 00:28:11.313 }, 00:28:11.313 { 00:28:11.313 "name": "BaseBdev2", 00:28:11.313 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:11.313 "is_configured": true, 00:28:11.313 "data_offset": 256, 00:28:11.313 "data_size": 7936 00:28:11.313 } 00:28:11.313 ] 00:28:11.313 }' 00:28:11.313 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.313 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:11.882 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:11.882 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:11.882 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:11.882 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:11.882 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:11.882 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.882 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.142 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:12.142 "name": "raid_bdev1", 00:28:12.142 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:12.142 "strip_size_kb": 0, 00:28:12.142 "state": "online", 00:28:12.142 "raid_level": "raid1", 00:28:12.142 "superblock": true, 00:28:12.142 "num_base_bdevs": 2, 00:28:12.142 "num_base_bdevs_discovered": 1, 00:28:12.142 "num_base_bdevs_operational": 1, 00:28:12.142 "base_bdevs_list": [ 00:28:12.142 { 00:28:12.142 "name": null, 00:28:12.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:12.142 "is_configured": false, 00:28:12.142 "data_offset": 256, 00:28:12.142 "data_size": 7936 00:28:12.142 }, 00:28:12.142 { 00:28:12.142 "name": "BaseBdev2", 00:28:12.142 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:12.142 "is_configured": true, 00:28:12.142 "data_offset": 256, 00:28:12.142 "data_size": 7936 00:28:12.142 } 00:28:12.142 ] 00:28:12.142 }' 00:28:12.142 09:32:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:12.142 09:32:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:12.142 09:32:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:12.142 09:32:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:12.142 09:32:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:12.401 09:32:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:12.659 [2024-07-15 09:32:21.434121] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:12.659 [2024-07-15 09:32:21.434169] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:12.659 [2024-07-15 09:32:21.434192] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1641900 00:28:12.659 [2024-07-15 09:32:21.434204] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:12.659 [2024-07-15 09:32:21.434400] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:12.659 [2024-07-15 09:32:21.434418] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:12.659 [2024-07-15 09:32:21.434463] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:12.660 [2024-07-15 09:32:21.434475] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:12.660 [2024-07-15 09:32:21.434487] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:12.660 BaseBdev1 00:28:12.660 09:32:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.595 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.854 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:13.854 "name": "raid_bdev1", 00:28:13.854 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:13.854 "strip_size_kb": 0, 00:28:13.854 "state": "online", 00:28:13.854 "raid_level": "raid1", 00:28:13.854 "superblock": true, 00:28:13.854 "num_base_bdevs": 2, 00:28:13.854 "num_base_bdevs_discovered": 1, 00:28:13.854 "num_base_bdevs_operational": 1, 00:28:13.854 "base_bdevs_list": [ 00:28:13.854 { 
00:28:13.854 "name": null, 00:28:13.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:13.854 "is_configured": false, 00:28:13.854 "data_offset": 256, 00:28:13.854 "data_size": 7936 00:28:13.854 }, 00:28:13.854 { 00:28:13.854 "name": "BaseBdev2", 00:28:13.854 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:13.854 "is_configured": true, 00:28:13.854 "data_offset": 256, 00:28:13.854 "data_size": 7936 00:28:13.854 } 00:28:13.854 ] 00:28:13.854 }' 00:28:13.854 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:13.854 09:32:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:14.789 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:14.789 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:14.789 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:14.789 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:14.789 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:14.789 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.789 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:14.789 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:14.789 "name": "raid_bdev1", 00:28:14.789 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:14.789 "strip_size_kb": 0, 00:28:14.789 "state": "online", 00:28:14.789 "raid_level": "raid1", 00:28:14.789 "superblock": true, 00:28:14.789 "num_base_bdevs": 2, 00:28:14.789 "num_base_bdevs_discovered": 1, 00:28:14.789 "num_base_bdevs_operational": 1, 00:28:14.789 "base_bdevs_list": [ 00:28:14.789 { 00:28:14.789 "name": null, 00:28:14.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:14.789 "is_configured": false, 00:28:14.789 "data_offset": 256, 00:28:14.789 "data_size": 7936 00:28:14.789 }, 00:28:14.789 { 00:28:14.789 "name": "BaseBdev2", 00:28:14.789 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:14.789 "is_configured": true, 00:28:14.789 "data_offset": 256, 00:28:14.789 "data_size": 7936 00:28:14.789 } 00:28:14.789 ] 00:28:14.789 }' 00:28:14.789 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:15.088 09:32:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:15.088 [2024-07-15 09:32:23.980921] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:15.088 [2024-07-15 09:32:23.981049] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:15.088 [2024-07-15 09:32:23.981064] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:15.088 request: 00:28:15.088 { 00:28:15.088 "base_bdev": "BaseBdev1", 00:28:15.088 "raid_bdev": "raid_bdev1", 00:28:15.088 "method": "bdev_raid_add_base_bdev", 00:28:15.088 "req_id": 1 00:28:15.088 } 00:28:15.088 Got JSON-RPC error response 00:28:15.088 response: 00:28:15.088 { 00:28:15.088 "code": -22, 00:28:15.088 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:15.088 } 00:28:15.088 09:32:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:15.088 09:32:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:15.088 09:32:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:15.088 09:32:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:15.088 09:32:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:16.467 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:16.467 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:16.467 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:16.467 09:32:25 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:16.467 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:16.467 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:16.468 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:16.468 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:16.468 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:16.468 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:16.468 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.468 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:16.468 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:16.468 "name": "raid_bdev1", 00:28:16.468 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:16.468 "strip_size_kb": 0, 00:28:16.468 "state": "online", 00:28:16.468 "raid_level": "raid1", 00:28:16.468 "superblock": true, 00:28:16.468 "num_base_bdevs": 2, 00:28:16.468 "num_base_bdevs_discovered": 1, 00:28:16.468 "num_base_bdevs_operational": 1, 00:28:16.468 "base_bdevs_list": [ 00:28:16.468 { 00:28:16.468 "name": null, 00:28:16.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:16.468 "is_configured": false, 00:28:16.468 "data_offset": 256, 00:28:16.468 "data_size": 7936 00:28:16.468 }, 00:28:16.468 { 00:28:16.468 "name": "BaseBdev2", 00:28:16.468 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:16.468 "is_configured": true, 00:28:16.468 "data_offset": 256, 00:28:16.468 "data_size": 7936 00:28:16.468 } 00:28:16.468 ] 00:28:16.468 }' 00:28:16.468 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:16.468 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:17.035 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:17.035 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:17.035 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:17.035 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:17.035 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:17.035 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.035 09:32:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.294 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:17.294 "name": "raid_bdev1", 00:28:17.294 "uuid": "00ee178c-ff7c-447d-b8e3-7b29a964e948", 00:28:17.294 "strip_size_kb": 0, 
00:28:17.294 "state": "online", 00:28:17.294 "raid_level": "raid1", 00:28:17.294 "superblock": true, 00:28:17.294 "num_base_bdevs": 2, 00:28:17.294 "num_base_bdevs_discovered": 1, 00:28:17.294 "num_base_bdevs_operational": 1, 00:28:17.294 "base_bdevs_list": [ 00:28:17.294 { 00:28:17.294 "name": null, 00:28:17.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.294 "is_configured": false, 00:28:17.294 "data_offset": 256, 00:28:17.294 "data_size": 7936 00:28:17.294 }, 00:28:17.294 { 00:28:17.294 "name": "BaseBdev2", 00:28:17.294 "uuid": "fa7f6333-f55f-5bbf-b204-4dc5af5ddc6a", 00:28:17.294 "is_configured": true, 00:28:17.294 "data_offset": 256, 00:28:17.294 "data_size": 7936 00:28:17.294 } 00:28:17.294 ] 00:28:17.294 }' 00:28:17.294 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:17.294 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:17.294 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:17.294 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:17.294 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 235545 00:28:17.294 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 235545 ']' 00:28:17.294 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 235545 00:28:17.294 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:17.295 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:17.295 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 235545 00:28:17.295 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:17.295 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:17.295 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 235545' 00:28:17.295 killing process with pid 235545 00:28:17.295 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 235545 00:28:17.295 Received shutdown signal, test time was about 60.000000 seconds 00:28:17.295 00:28:17.295 Latency(us) 00:28:17.295 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:17.295 =================================================================================================================== 00:28:17.295 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:17.295 [2024-07-15 09:32:26.246517] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:17.295 [2024-07-15 09:32:26.246604] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:17.295 [2024-07-15 09:32:26.246647] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:17.295 [2024-07-15 09:32:26.246660] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16a77c0 name raid_bdev1, state offline 00:28:17.295 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 235545 
00:28:17.553 [2024-07-15 09:32:26.280872] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:17.553 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:28:17.553 00:28:17.553 real 0m31.199s 00:28:17.553 user 0m48.594s 00:28:17.553 sys 0m5.034s 00:28:17.553 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:17.553 09:32:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:17.553 ************************************ 00:28:17.553 END TEST raid_rebuild_test_sb_md_separate 00:28:17.553 ************************************ 00:28:17.812 09:32:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:17.812 09:32:26 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:28:17.812 09:32:26 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:28:17.812 09:32:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:17.812 09:32:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:17.812 09:32:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:17.812 ************************************ 00:28:17.812 START TEST raid_state_function_test_sb_md_interleaved 00:28:17.812 ************************************ 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=239957 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 239957' 00:28:17.812 Process raid pid: 239957 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 239957 /var/tmp/spdk-raid.sock 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 239957 ']' 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:17.812 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:17.812 09:32:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:17.812 [2024-07-15 09:32:26.661687] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
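The interleaved state-function test starting here drives a bare bdev_svc app rather than a full target: the raid bdev is created first with base bdevs that do not yet exist, so Existed_Raid sits in the "configuring" state until malloc bdevs with 32-byte interleaved metadata (-m 32 -i, hence the 4128-byte block size reported later) are registered under the expected names. A minimal sketch of that startup order, reusing the paths from this run and collapsing the harness's delete/recreate steps; backgrounding bdev_svc with & is shorthand here, the harness uses its own process management and waitforlisten:
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
$rpc -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid   # stays "configuring": base bdevs don't exist yet
$rpc -s $sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1                        # 4096-byte data + 32-byte interleaved md = 4128-byte blocks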
00:28:17.812 [2024-07-15 09:32:26.661757] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:18.071 [2024-07-15 09:32:26.790747] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:18.071 [2024-07-15 09:32:26.894158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:18.071 [2024-07-15 09:32:26.953652] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:18.071 [2024-07-15 09:32:26.953680] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:18.639 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:18.639 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:18.639 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:18.899 [2024-07-15 09:32:27.814148] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:18.899 [2024-07-15 09:32:27.814192] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:18.899 [2024-07-15 09:32:27.814203] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:18.899 [2024-07-15 09:32:27.814215] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.899 09:32:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:19.158 09:32:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:19.158 "name": "Existed_Raid", 00:28:19.158 "uuid": "dc38689b-857d-4046-b055-1ea295f95036", 00:28:19.158 "strip_size_kb": 0, 00:28:19.158 "state": "configuring", 00:28:19.158 "raid_level": "raid1", 00:28:19.158 "superblock": true, 00:28:19.158 "num_base_bdevs": 2, 00:28:19.158 "num_base_bdevs_discovered": 0, 00:28:19.158 "num_base_bdevs_operational": 2, 00:28:19.158 "base_bdevs_list": [ 00:28:19.158 { 00:28:19.158 "name": "BaseBdev1", 00:28:19.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:19.158 "is_configured": false, 00:28:19.158 "data_offset": 0, 00:28:19.158 "data_size": 0 00:28:19.158 }, 00:28:19.158 { 00:28:19.158 "name": "BaseBdev2", 00:28:19.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:19.159 "is_configured": false, 00:28:19.159 "data_offset": 0, 00:28:19.159 "data_size": 0 00:28:19.159 } 00:28:19.159 ] 00:28:19.159 }' 00:28:19.159 09:32:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:19.159 09:32:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:20.097 09:32:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:20.097 [2024-07-15 09:32:28.844737] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:20.097 [2024-07-15 09:32:28.844773] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aefa80 name Existed_Raid, state configuring 00:28:20.097 09:32:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:20.355 [2024-07-15 09:32:29.073362] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:20.355 [2024-07-15 09:32:29.073395] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:20.355 [2024-07-15 09:32:29.073405] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:20.355 [2024-07-15 09:32:29.073416] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:20.355 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:28:20.355 [2024-07-15 09:32:29.248862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:20.355 BaseBdev1 00:28:20.355 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:20.355 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:20.355 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:20.355 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:20.355 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:20.355 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:20.355 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:20.614 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:20.873 [ 00:28:20.873 { 00:28:20.873 "name": "BaseBdev1", 00:28:20.873 "aliases": [ 00:28:20.873 "2496cccf-be4c-4a53-b5dc-7da57e7a7f33" 00:28:20.873 ], 00:28:20.873 "product_name": "Malloc disk", 00:28:20.873 "block_size": 4128, 00:28:20.873 "num_blocks": 8192, 00:28:20.873 "uuid": "2496cccf-be4c-4a53-b5dc-7da57e7a7f33", 00:28:20.873 "md_size": 32, 00:28:20.873 "md_interleave": true, 00:28:20.873 "dif_type": 0, 00:28:20.873 "assigned_rate_limits": { 00:28:20.873 "rw_ios_per_sec": 0, 00:28:20.873 "rw_mbytes_per_sec": 0, 00:28:20.873 "r_mbytes_per_sec": 0, 00:28:20.873 "w_mbytes_per_sec": 0 00:28:20.873 }, 00:28:20.873 "claimed": true, 00:28:20.873 "claim_type": "exclusive_write", 00:28:20.873 "zoned": false, 00:28:20.873 "supported_io_types": { 00:28:20.873 "read": true, 00:28:20.873 "write": true, 00:28:20.873 "unmap": true, 00:28:20.873 "flush": true, 00:28:20.873 "reset": true, 00:28:20.873 "nvme_admin": false, 00:28:20.873 "nvme_io": false, 00:28:20.873 "nvme_io_md": false, 00:28:20.873 "write_zeroes": true, 00:28:20.873 "zcopy": true, 00:28:20.873 "get_zone_info": false, 00:28:20.873 "zone_management": false, 00:28:20.873 "zone_append": false, 00:28:20.873 "compare": false, 00:28:20.873 "compare_and_write": false, 00:28:20.873 "abort": true, 00:28:20.873 "seek_hole": false, 00:28:20.873 "seek_data": false, 00:28:20.873 "copy": true, 00:28:20.873 "nvme_iov_md": false 00:28:20.873 }, 00:28:20.873 "memory_domains": [ 00:28:20.873 { 00:28:20.873 "dma_device_id": "system", 00:28:20.873 "dma_device_type": 1 00:28:20.873 }, 00:28:20.873 { 00:28:20.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:20.873 "dma_device_type": 2 00:28:20.873 } 00:28:20.873 ], 00:28:20.873 "driver_specific": {} 00:28:20.873 } 00:28:20.873 ] 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:20.873 "name": "Existed_Raid", 00:28:20.873 "uuid": "901dfec3-c391-477e-84e7-5c59730c18ec", 00:28:20.873 "strip_size_kb": 0, 00:28:20.873 "state": "configuring", 00:28:20.873 "raid_level": "raid1", 00:28:20.873 "superblock": true, 00:28:20.873 "num_base_bdevs": 2, 00:28:20.873 "num_base_bdevs_discovered": 1, 00:28:20.873 "num_base_bdevs_operational": 2, 00:28:20.873 "base_bdevs_list": [ 00:28:20.873 { 00:28:20.873 "name": "BaseBdev1", 00:28:20.873 "uuid": "2496cccf-be4c-4a53-b5dc-7da57e7a7f33", 00:28:20.873 "is_configured": true, 00:28:20.873 "data_offset": 256, 00:28:20.873 "data_size": 7936 00:28:20.873 }, 00:28:20.873 { 00:28:20.873 "name": "BaseBdev2", 00:28:20.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.873 "is_configured": false, 00:28:20.873 "data_offset": 0, 00:28:20.873 "data_size": 0 00:28:20.873 } 00:28:20.873 ] 00:28:20.873 }' 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:20.873 09:32:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:21.811 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:21.811 [2024-07-15 09:32:30.636561] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:21.811 [2024-07-15 09:32:30.636610] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aef350 name Existed_Raid, state configuring 00:28:21.811 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:22.070 [2024-07-15 09:32:30.809067] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:22.070 [2024-07-15 09:32:30.810568] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:22.070 [2024-07-15 09:32:30.810602] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:22.070 09:32:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.329 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:22.329 "name": "Existed_Raid", 00:28:22.329 "uuid": "992dae24-baca-4ffc-a13f-2baceb0fa5a7", 00:28:22.329 "strip_size_kb": 0, 00:28:22.329 "state": "configuring", 00:28:22.329 "raid_level": "raid1", 00:28:22.329 "superblock": true, 00:28:22.329 "num_base_bdevs": 2, 00:28:22.329 "num_base_bdevs_discovered": 1, 00:28:22.329 "num_base_bdevs_operational": 2, 00:28:22.329 "base_bdevs_list": [ 00:28:22.329 { 00:28:22.329 "name": "BaseBdev1", 00:28:22.329 "uuid": "2496cccf-be4c-4a53-b5dc-7da57e7a7f33", 00:28:22.329 "is_configured": true, 00:28:22.329 "data_offset": 256, 00:28:22.329 "data_size": 7936 00:28:22.329 }, 00:28:22.329 { 00:28:22.329 "name": "BaseBdev2", 00:28:22.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:22.329 "is_configured": false, 00:28:22.329 "data_offset": 0, 00:28:22.329 "data_size": 0 00:28:22.329 } 00:28:22.329 ] 00:28:22.329 }' 00:28:22.329 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:22.329 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:22.897 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:28:23.155 [2024-07-15 09:32:31.911533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:23.155 [2024-07-15 09:32:31.911666] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1af1180 00:28:23.155 [2024-07-15 09:32:31.911679] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:23.155 [2024-07-15 09:32:31.911739] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1af1150 00:28:23.155 [2024-07-15 09:32:31.911814] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1af1180 00:28:23.155 [2024-07-15 09:32:31.911823] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0x1af1180 00:28:23.155 [2024-07-15 09:32:31.911880] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:23.155 BaseBdev2 00:28:23.155 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:23.155 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:23.155 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:23.156 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:23.156 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:23.156 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:23.156 09:32:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:23.156 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:23.414 [ 00:28:23.414 { 00:28:23.414 "name": "BaseBdev2", 00:28:23.414 "aliases": [ 00:28:23.414 "33162df7-12d6-4ca6-a413-7e7abf1e0559" 00:28:23.414 ], 00:28:23.414 "product_name": "Malloc disk", 00:28:23.414 "block_size": 4128, 00:28:23.414 "num_blocks": 8192, 00:28:23.414 "uuid": "33162df7-12d6-4ca6-a413-7e7abf1e0559", 00:28:23.414 "md_size": 32, 00:28:23.414 "md_interleave": true, 00:28:23.414 "dif_type": 0, 00:28:23.414 "assigned_rate_limits": { 00:28:23.414 "rw_ios_per_sec": 0, 00:28:23.414 "rw_mbytes_per_sec": 0, 00:28:23.414 "r_mbytes_per_sec": 0, 00:28:23.414 "w_mbytes_per_sec": 0 00:28:23.414 }, 00:28:23.414 "claimed": true, 00:28:23.414 "claim_type": "exclusive_write", 00:28:23.414 "zoned": false, 00:28:23.414 "supported_io_types": { 00:28:23.414 "read": true, 00:28:23.414 "write": true, 00:28:23.414 "unmap": true, 00:28:23.414 "flush": true, 00:28:23.414 "reset": true, 00:28:23.414 "nvme_admin": false, 00:28:23.414 "nvme_io": false, 00:28:23.414 "nvme_io_md": false, 00:28:23.414 "write_zeroes": true, 00:28:23.414 "zcopy": true, 00:28:23.414 "get_zone_info": false, 00:28:23.414 "zone_management": false, 00:28:23.414 "zone_append": false, 00:28:23.414 "compare": false, 00:28:23.414 "compare_and_write": false, 00:28:23.414 "abort": true, 00:28:23.414 "seek_hole": false, 00:28:23.414 "seek_data": false, 00:28:23.414 "copy": true, 00:28:23.414 "nvme_iov_md": false 00:28:23.414 }, 00:28:23.414 "memory_domains": [ 00:28:23.414 { 00:28:23.414 "dma_device_id": "system", 00:28:23.414 "dma_device_type": 1 00:28:23.414 }, 00:28:23.414 { 00:28:23.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:23.414 "dma_device_type": 2 00:28:23.414 } 00:28:23.414 ], 00:28:23.414 "driver_specific": {} 00:28:23.414 } 00:28:23.414 ] 00:28:23.414 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:23.414 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:23.414 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:23.414 09:32:32 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.415 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:23.673 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.673 "name": "Existed_Raid", 00:28:23.673 "uuid": "992dae24-baca-4ffc-a13f-2baceb0fa5a7", 00:28:23.673 "strip_size_kb": 0, 00:28:23.673 "state": "online", 00:28:23.673 "raid_level": "raid1", 00:28:23.673 "superblock": true, 00:28:23.673 "num_base_bdevs": 2, 00:28:23.673 "num_base_bdevs_discovered": 2, 00:28:23.673 "num_base_bdevs_operational": 2, 00:28:23.673 "base_bdevs_list": [ 00:28:23.673 { 00:28:23.673 "name": "BaseBdev1", 00:28:23.673 "uuid": "2496cccf-be4c-4a53-b5dc-7da57e7a7f33", 00:28:23.673 "is_configured": true, 00:28:23.673 "data_offset": 256, 00:28:23.673 "data_size": 7936 00:28:23.673 }, 00:28:23.673 { 00:28:23.673 "name": "BaseBdev2", 00:28:23.673 "uuid": "33162df7-12d6-4ca6-a413-7e7abf1e0559", 00:28:23.674 "is_configured": true, 00:28:23.674 "data_offset": 256, 00:28:23.674 "data_size": 7936 00:28:23.674 } 00:28:23.674 ] 00:28:23.674 }' 00:28:23.674 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.674 09:32:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:24.611 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:24.611 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:24.611 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:24.611 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:24.611 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # 
local base_bdev_names 00:28:24.611 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:24.611 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:24.611 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:24.611 [2024-07-15 09:32:33.427865] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:24.611 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:24.611 "name": "Existed_Raid", 00:28:24.611 "aliases": [ 00:28:24.611 "992dae24-baca-4ffc-a13f-2baceb0fa5a7" 00:28:24.611 ], 00:28:24.611 "product_name": "Raid Volume", 00:28:24.611 "block_size": 4128, 00:28:24.611 "num_blocks": 7936, 00:28:24.611 "uuid": "992dae24-baca-4ffc-a13f-2baceb0fa5a7", 00:28:24.611 "md_size": 32, 00:28:24.611 "md_interleave": true, 00:28:24.611 "dif_type": 0, 00:28:24.611 "assigned_rate_limits": { 00:28:24.611 "rw_ios_per_sec": 0, 00:28:24.611 "rw_mbytes_per_sec": 0, 00:28:24.611 "r_mbytes_per_sec": 0, 00:28:24.611 "w_mbytes_per_sec": 0 00:28:24.611 }, 00:28:24.611 "claimed": false, 00:28:24.611 "zoned": false, 00:28:24.611 "supported_io_types": { 00:28:24.611 "read": true, 00:28:24.611 "write": true, 00:28:24.611 "unmap": false, 00:28:24.611 "flush": false, 00:28:24.611 "reset": true, 00:28:24.611 "nvme_admin": false, 00:28:24.611 "nvme_io": false, 00:28:24.611 "nvme_io_md": false, 00:28:24.611 "write_zeroes": true, 00:28:24.611 "zcopy": false, 00:28:24.611 "get_zone_info": false, 00:28:24.611 "zone_management": false, 00:28:24.611 "zone_append": false, 00:28:24.611 "compare": false, 00:28:24.611 "compare_and_write": false, 00:28:24.611 "abort": false, 00:28:24.611 "seek_hole": false, 00:28:24.611 "seek_data": false, 00:28:24.611 "copy": false, 00:28:24.611 "nvme_iov_md": false 00:28:24.611 }, 00:28:24.611 "memory_domains": [ 00:28:24.611 { 00:28:24.611 "dma_device_id": "system", 00:28:24.611 "dma_device_type": 1 00:28:24.611 }, 00:28:24.611 { 00:28:24.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:24.611 "dma_device_type": 2 00:28:24.611 }, 00:28:24.611 { 00:28:24.611 "dma_device_id": "system", 00:28:24.611 "dma_device_type": 1 00:28:24.611 }, 00:28:24.611 { 00:28:24.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:24.611 "dma_device_type": 2 00:28:24.611 } 00:28:24.611 ], 00:28:24.611 "driver_specific": { 00:28:24.611 "raid": { 00:28:24.611 "uuid": "992dae24-baca-4ffc-a13f-2baceb0fa5a7", 00:28:24.611 "strip_size_kb": 0, 00:28:24.611 "state": "online", 00:28:24.611 "raid_level": "raid1", 00:28:24.612 "superblock": true, 00:28:24.612 "num_base_bdevs": 2, 00:28:24.612 "num_base_bdevs_discovered": 2, 00:28:24.612 "num_base_bdevs_operational": 2, 00:28:24.612 "base_bdevs_list": [ 00:28:24.612 { 00:28:24.612 "name": "BaseBdev1", 00:28:24.612 "uuid": "2496cccf-be4c-4a53-b5dc-7da57e7a7f33", 00:28:24.612 "is_configured": true, 00:28:24.612 "data_offset": 256, 00:28:24.612 "data_size": 7936 00:28:24.612 }, 00:28:24.612 { 00:28:24.612 "name": "BaseBdev2", 00:28:24.612 "uuid": "33162df7-12d6-4ca6-a413-7e7abf1e0559", 00:28:24.612 "is_configured": true, 00:28:24.612 "data_offset": 256, 00:28:24.612 "data_size": 7936 00:28:24.612 } 00:28:24.612 ] 00:28:24.612 } 00:28:24.612 } 00:28:24.612 }' 00:28:24.612 09:32:33 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:24.612 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:24.612 BaseBdev2' 00:28:24.612 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:24.612 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:24.612 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:24.871 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:24.871 "name": "BaseBdev1", 00:28:24.871 "aliases": [ 00:28:24.871 "2496cccf-be4c-4a53-b5dc-7da57e7a7f33" 00:28:24.871 ], 00:28:24.871 "product_name": "Malloc disk", 00:28:24.871 "block_size": 4128, 00:28:24.871 "num_blocks": 8192, 00:28:24.871 "uuid": "2496cccf-be4c-4a53-b5dc-7da57e7a7f33", 00:28:24.871 "md_size": 32, 00:28:24.871 "md_interleave": true, 00:28:24.871 "dif_type": 0, 00:28:24.871 "assigned_rate_limits": { 00:28:24.871 "rw_ios_per_sec": 0, 00:28:24.871 "rw_mbytes_per_sec": 0, 00:28:24.871 "r_mbytes_per_sec": 0, 00:28:24.871 "w_mbytes_per_sec": 0 00:28:24.871 }, 00:28:24.871 "claimed": true, 00:28:24.871 "claim_type": "exclusive_write", 00:28:24.871 "zoned": false, 00:28:24.871 "supported_io_types": { 00:28:24.871 "read": true, 00:28:24.871 "write": true, 00:28:24.871 "unmap": true, 00:28:24.871 "flush": true, 00:28:24.871 "reset": true, 00:28:24.871 "nvme_admin": false, 00:28:24.871 "nvme_io": false, 00:28:24.871 "nvme_io_md": false, 00:28:24.871 "write_zeroes": true, 00:28:24.871 "zcopy": true, 00:28:24.871 "get_zone_info": false, 00:28:24.871 "zone_management": false, 00:28:24.871 "zone_append": false, 00:28:24.871 "compare": false, 00:28:24.871 "compare_and_write": false, 00:28:24.871 "abort": true, 00:28:24.871 "seek_hole": false, 00:28:24.871 "seek_data": false, 00:28:24.871 "copy": true, 00:28:24.871 "nvme_iov_md": false 00:28:24.871 }, 00:28:24.871 "memory_domains": [ 00:28:24.871 { 00:28:24.871 "dma_device_id": "system", 00:28:24.871 "dma_device_type": 1 00:28:24.871 }, 00:28:24.871 { 00:28:24.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:24.871 "dma_device_type": 2 00:28:24.871 } 00:28:24.871 ], 00:28:24.871 "driver_specific": {} 00:28:24.871 }' 00:28:24.871 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:24.871 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:25.131 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:25.131 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:25.131 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:25.131 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:25.131 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:25.131 09:32:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:25.131 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:25.131 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:25.131 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:25.389 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:25.389 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:25.389 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:25.389 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:25.648 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:25.648 "name": "BaseBdev2", 00:28:25.648 "aliases": [ 00:28:25.648 "33162df7-12d6-4ca6-a413-7e7abf1e0559" 00:28:25.648 ], 00:28:25.648 "product_name": "Malloc disk", 00:28:25.648 "block_size": 4128, 00:28:25.648 "num_blocks": 8192, 00:28:25.648 "uuid": "33162df7-12d6-4ca6-a413-7e7abf1e0559", 00:28:25.648 "md_size": 32, 00:28:25.648 "md_interleave": true, 00:28:25.648 "dif_type": 0, 00:28:25.648 "assigned_rate_limits": { 00:28:25.648 "rw_ios_per_sec": 0, 00:28:25.648 "rw_mbytes_per_sec": 0, 00:28:25.648 "r_mbytes_per_sec": 0, 00:28:25.648 "w_mbytes_per_sec": 0 00:28:25.648 }, 00:28:25.648 "claimed": true, 00:28:25.648 "claim_type": "exclusive_write", 00:28:25.648 "zoned": false, 00:28:25.648 "supported_io_types": { 00:28:25.648 "read": true, 00:28:25.648 "write": true, 00:28:25.648 "unmap": true, 00:28:25.648 "flush": true, 00:28:25.648 "reset": true, 00:28:25.648 "nvme_admin": false, 00:28:25.648 "nvme_io": false, 00:28:25.648 "nvme_io_md": false, 00:28:25.648 "write_zeroes": true, 00:28:25.648 "zcopy": true, 00:28:25.648 "get_zone_info": false, 00:28:25.648 "zone_management": false, 00:28:25.648 "zone_append": false, 00:28:25.648 "compare": false, 00:28:25.648 "compare_and_write": false, 00:28:25.648 "abort": true, 00:28:25.648 "seek_hole": false, 00:28:25.648 "seek_data": false, 00:28:25.648 "copy": true, 00:28:25.648 "nvme_iov_md": false 00:28:25.648 }, 00:28:25.648 "memory_domains": [ 00:28:25.648 { 00:28:25.648 "dma_device_id": "system", 00:28:25.648 "dma_device_type": 1 00:28:25.648 }, 00:28:25.648 { 00:28:25.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:25.648 "dma_device_type": 2 00:28:25.648 } 00:28:25.648 ], 00:28:25.648 "driver_specific": {} 00:28:25.648 }' 00:28:25.648 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:25.648 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:25.648 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:25.648 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:25.648 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:25.648 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
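The jq checks in this stretch of the trace verify the interleaved-metadata geometry of each base bdev: block_size 4128 (a 4096-byte data block plus 32 bytes of interleaved metadata), md_size 32, md_interleave true, dif_type 0. A minimal hand-run sketch of the same verification, assuming the test's RPC socket is still up, using the bdev name from the trace and an rpc.py path relative to the SPDK repo root:

# fetch BaseBdev1's descriptor and repeat the geometry checks seen in the trace
info=$(./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 | jq '.[]')
[[ $(jq .block_size    <<< "$info") == 4128 ]]   # 4096-byte data block + 32-byte interleaved md
[[ $(jq .md_size       <<< "$info") == 32   ]]
[[ $(jq .md_interleave <<< "$info") == true ]]
[[ $(jq .dif_type      <<< "$info") == 0    ]]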
00:28:25.648 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:25.648 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:25.907 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:25.907 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:25.907 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:25.907 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:25.907 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:26.164 [2024-07-15 09:32:34.875484] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.164 09:32:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:26.422 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:26.422 "name": "Existed_Raid", 00:28:26.422 "uuid": 
"992dae24-baca-4ffc-a13f-2baceb0fa5a7", 00:28:26.422 "strip_size_kb": 0, 00:28:26.422 "state": "online", 00:28:26.422 "raid_level": "raid1", 00:28:26.422 "superblock": true, 00:28:26.422 "num_base_bdevs": 2, 00:28:26.422 "num_base_bdevs_discovered": 1, 00:28:26.422 "num_base_bdevs_operational": 1, 00:28:26.422 "base_bdevs_list": [ 00:28:26.422 { 00:28:26.422 "name": null, 00:28:26.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:26.422 "is_configured": false, 00:28:26.422 "data_offset": 256, 00:28:26.422 "data_size": 7936 00:28:26.422 }, 00:28:26.422 { 00:28:26.422 "name": "BaseBdev2", 00:28:26.422 "uuid": "33162df7-12d6-4ca6-a413-7e7abf1e0559", 00:28:26.422 "is_configured": true, 00:28:26.422 "data_offset": 256, 00:28:26.422 "data_size": 7936 00:28:26.422 } 00:28:26.422 ] 00:28:26.422 }' 00:28:26.422 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:26.422 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:26.988 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:26.988 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:26.988 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.988 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:27.247 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:27.247 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:27.247 09:32:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:27.811 [2024-07-15 09:32:36.472825] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:27.811 [2024-07-15 09:32:36.472918] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:27.811 [2024-07-15 09:32:36.486129] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:27.811 [2024-07-15 09:32:36.486167] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:27.811 [2024-07-15 09:32:36.486180] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1af1180 name Existed_Raid, state offline 00:28:27.811 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:27.811 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:27.811 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.811 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:27.811 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:27.811 09:32:36 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:27.811 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:27.811 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 239957 00:28:27.811 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 239957 ']' 00:28:27.812 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 239957 00:28:28.069 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:28.069 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:28.069 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 239957 00:28:28.069 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:28.069 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:28.069 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 239957' 00:28:28.069 killing process with pid 239957 00:28:28.069 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 239957 00:28:28.069 [2024-07-15 09:32:36.814400] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:28.069 09:32:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 239957 00:28:28.069 [2024-07-15 09:32:36.815393] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:28.326 09:32:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:28:28.326 00:28:28.326 real 0m10.436s 00:28:28.326 user 0m18.584s 00:28:28.326 sys 0m1.921s 00:28:28.326 09:32:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:28.326 09:32:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:28.326 ************************************ 00:28:28.326 END TEST raid_state_function_test_sb_md_interleaved 00:28:28.326 ************************************ 00:28:28.326 09:32:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:28.326 09:32:37 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:28:28.326 09:32:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:28.326 09:32:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:28.326 09:32:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:28.326 ************************************ 00:28:28.326 START TEST raid_superblock_test_md_interleaved 00:28:28.326 ************************************ 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local 
num_base_bdevs=2 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=241575 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 241575 /var/tmp/spdk-raid.sock 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 241575 ']' 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:28.326 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:28.326 09:32:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:28.326 [2024-07-15 09:32:37.167222] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
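From here the superblock variant of the test starts its own bdev_svc instance (pid 241575 in this run) on the same dedicated RPC socket, with bdev_raid debug logging enabled. A condensed sketch of that launch step, with paths shortened to the SPDK repo root; the trace only shows the bdev_svc command line and the waitforlisten call (an autotest_common.sh helper), so backgrounding the app and capturing its pid here is an assumption about the harness:

# start a standalone bdev service for the raid tests and wait for its RPC socket
./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &
raid_pid=$!                                        # assumed: harness tracks the pid this way
waitforlisten $raid_pid /var/tmp/spdk-raid.sock    # blocks until the UNIX socket accepts RPCs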
00:28:28.326 [2024-07-15 09:32:37.167288] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid241575 ] 00:28:28.583 [2024-07-15 09:32:37.293457] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:28.583 [2024-07-15 09:32:37.395101] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:28.583 [2024-07-15 09:32:37.453868] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:28.583 [2024-07-15 09:32:37.453915] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:29.150 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:29.150 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:29.150 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:29.150 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:29.151 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:29.151 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:29.151 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:29.151 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:29.151 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:29.151 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:29.151 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:28:29.719 malloc1 00:28:29.719 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:30.024 [2024-07-15 09:32:38.843919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:30.024 [2024-07-15 09:32:38.843973] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:30.024 [2024-07-15 09:32:38.843994] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12854e0 00:28:30.024 [2024-07-15 09:32:38.844007] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:30.024 [2024-07-15 09:32:38.845567] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:30.024 [2024-07-15 09:32:38.845596] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:30.024 pt1 00:28:30.024 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:30.024 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:30.024 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:30.024 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:30.024 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:30.024 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:30.024 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:30.024 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:30.024 09:32:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:28:30.622 malloc2 00:28:30.622 09:32:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:30.881 [2024-07-15 09:32:39.611012] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:30.881 [2024-07-15 09:32:39.611063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:30.881 [2024-07-15 09:32:39.611082] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x126a570 00:28:30.881 [2024-07-15 09:32:39.611095] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:30.881 [2024-07-15 09:32:39.612599] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:30.881 [2024-07-15 09:32:39.612632] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:30.881 pt2 00:28:30.881 09:32:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:30.881 09:32:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:30.881 09:32:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:31.474 [2024-07-15 09:32:40.116344] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:31.474 [2024-07-15 09:32:40.117855] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:31.474 [2024-07-15 09:32:40.118021] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x126bf20 00:28:31.474 [2024-07-15 09:32:40.118037] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:31.474 [2024-07-15 09:32:40.118114] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10e8050 00:28:31.474 [2024-07-15 09:32:40.118201] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x126bf20 00:28:31.474 [2024-07-15 09:32:40.118211] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x126bf20 00:28:31.474 [2024-07-15 09:32:40.118276] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.474 "name": "raid_bdev1", 00:28:31.474 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:31.474 "strip_size_kb": 0, 00:28:31.474 "state": "online", 00:28:31.474 "raid_level": "raid1", 00:28:31.474 "superblock": true, 00:28:31.474 "num_base_bdevs": 2, 00:28:31.474 "num_base_bdevs_discovered": 2, 00:28:31.474 "num_base_bdevs_operational": 2, 00:28:31.474 "base_bdevs_list": [ 00:28:31.474 { 00:28:31.474 "name": "pt1", 00:28:31.474 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:31.474 "is_configured": true, 00:28:31.474 "data_offset": 256, 00:28:31.474 "data_size": 7936 00:28:31.474 }, 00:28:31.474 { 00:28:31.474 "name": "pt2", 00:28:31.474 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:31.474 "is_configured": true, 00:28:31.474 "data_offset": 256, 00:28:31.474 "data_size": 7936 00:28:31.474 } 00:28:31.474 ] 00:28:31.474 }' 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.474 09:32:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:32.410 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:32.410 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:32.410 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:32.410 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:32.410 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:32.410 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:32.410 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 
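The lines above show the base-bdev stack the superblock test builds before creating the array: a 32 MiB malloc bdev with 4096-byte blocks and 32 bytes of interleaved metadata, wrapped in a passthru bdev with a fixed UUID, then two of those combined into a raid1 volume with an on-disk superblock (-s). A condensed, hand-runnable sketch of the same RPC sequence, with socket path, arguments and names copied from the trace and rpc.py invoked from the SPDK repo root:

# rebuild the raid_bdev1 stack from the trace: malloc (32B interleaved md) -> passthru -> raid1 + superblock
rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc bdev_malloc_create 32 4096 -m 32 -i -b malloc1
$rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc bdev_malloc_create 32 4096 -m 32 -i -b malloc2
$rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
$rpc bdev_raid_create -s -r raid1 -b 'pt1 pt2' -n raid_bdev1
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # expect "online"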
00:28:32.410 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:32.410 [2024-07-15 09:32:41.231523] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:32.410 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:32.410 "name": "raid_bdev1", 00:28:32.410 "aliases": [ 00:28:32.410 "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7" 00:28:32.410 ], 00:28:32.410 "product_name": "Raid Volume", 00:28:32.410 "block_size": 4128, 00:28:32.410 "num_blocks": 7936, 00:28:32.410 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:32.410 "md_size": 32, 00:28:32.410 "md_interleave": true, 00:28:32.410 "dif_type": 0, 00:28:32.410 "assigned_rate_limits": { 00:28:32.410 "rw_ios_per_sec": 0, 00:28:32.410 "rw_mbytes_per_sec": 0, 00:28:32.410 "r_mbytes_per_sec": 0, 00:28:32.410 "w_mbytes_per_sec": 0 00:28:32.410 }, 00:28:32.410 "claimed": false, 00:28:32.410 "zoned": false, 00:28:32.410 "supported_io_types": { 00:28:32.411 "read": true, 00:28:32.411 "write": true, 00:28:32.411 "unmap": false, 00:28:32.411 "flush": false, 00:28:32.411 "reset": true, 00:28:32.411 "nvme_admin": false, 00:28:32.411 "nvme_io": false, 00:28:32.411 "nvme_io_md": false, 00:28:32.411 "write_zeroes": true, 00:28:32.411 "zcopy": false, 00:28:32.411 "get_zone_info": false, 00:28:32.411 "zone_management": false, 00:28:32.411 "zone_append": false, 00:28:32.411 "compare": false, 00:28:32.411 "compare_and_write": false, 00:28:32.411 "abort": false, 00:28:32.411 "seek_hole": false, 00:28:32.411 "seek_data": false, 00:28:32.411 "copy": false, 00:28:32.411 "nvme_iov_md": false 00:28:32.411 }, 00:28:32.411 "memory_domains": [ 00:28:32.411 { 00:28:32.411 "dma_device_id": "system", 00:28:32.411 "dma_device_type": 1 00:28:32.411 }, 00:28:32.411 { 00:28:32.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:32.411 "dma_device_type": 2 00:28:32.411 }, 00:28:32.411 { 00:28:32.411 "dma_device_id": "system", 00:28:32.411 "dma_device_type": 1 00:28:32.411 }, 00:28:32.411 { 00:28:32.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:32.411 "dma_device_type": 2 00:28:32.411 } 00:28:32.411 ], 00:28:32.411 "driver_specific": { 00:28:32.411 "raid": { 00:28:32.411 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:32.411 "strip_size_kb": 0, 00:28:32.411 "state": "online", 00:28:32.411 "raid_level": "raid1", 00:28:32.411 "superblock": true, 00:28:32.411 "num_base_bdevs": 2, 00:28:32.411 "num_base_bdevs_discovered": 2, 00:28:32.411 "num_base_bdevs_operational": 2, 00:28:32.411 "base_bdevs_list": [ 00:28:32.411 { 00:28:32.411 "name": "pt1", 00:28:32.411 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:32.411 "is_configured": true, 00:28:32.411 "data_offset": 256, 00:28:32.411 "data_size": 7936 00:28:32.411 }, 00:28:32.411 { 00:28:32.411 "name": "pt2", 00:28:32.411 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:32.411 "is_configured": true, 00:28:32.411 "data_offset": 256, 00:28:32.411 "data_size": 7936 00:28:32.411 } 00:28:32.411 ] 00:28:32.411 } 00:28:32.411 } 00:28:32.411 }' 00:28:32.411 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:32.411 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:32.411 pt2' 00:28:32.411 09:32:41 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:32.411 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:32.411 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:32.670 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:32.670 "name": "pt1", 00:28:32.670 "aliases": [ 00:28:32.670 "00000000-0000-0000-0000-000000000001" 00:28:32.670 ], 00:28:32.670 "product_name": "passthru", 00:28:32.670 "block_size": 4128, 00:28:32.670 "num_blocks": 8192, 00:28:32.670 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:32.670 "md_size": 32, 00:28:32.670 "md_interleave": true, 00:28:32.670 "dif_type": 0, 00:28:32.670 "assigned_rate_limits": { 00:28:32.670 "rw_ios_per_sec": 0, 00:28:32.670 "rw_mbytes_per_sec": 0, 00:28:32.670 "r_mbytes_per_sec": 0, 00:28:32.670 "w_mbytes_per_sec": 0 00:28:32.670 }, 00:28:32.670 "claimed": true, 00:28:32.670 "claim_type": "exclusive_write", 00:28:32.670 "zoned": false, 00:28:32.670 "supported_io_types": { 00:28:32.670 "read": true, 00:28:32.670 "write": true, 00:28:32.670 "unmap": true, 00:28:32.670 "flush": true, 00:28:32.670 "reset": true, 00:28:32.670 "nvme_admin": false, 00:28:32.670 "nvme_io": false, 00:28:32.670 "nvme_io_md": false, 00:28:32.670 "write_zeroes": true, 00:28:32.670 "zcopy": true, 00:28:32.670 "get_zone_info": false, 00:28:32.670 "zone_management": false, 00:28:32.670 "zone_append": false, 00:28:32.670 "compare": false, 00:28:32.670 "compare_and_write": false, 00:28:32.670 "abort": true, 00:28:32.670 "seek_hole": false, 00:28:32.670 "seek_data": false, 00:28:32.670 "copy": true, 00:28:32.670 "nvme_iov_md": false 00:28:32.670 }, 00:28:32.670 "memory_domains": [ 00:28:32.670 { 00:28:32.670 "dma_device_id": "system", 00:28:32.670 "dma_device_type": 1 00:28:32.670 }, 00:28:32.670 { 00:28:32.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:32.670 "dma_device_type": 2 00:28:32.670 } 00:28:32.670 ], 00:28:32.670 "driver_specific": { 00:28:32.670 "passthru": { 00:28:32.670 "name": "pt1", 00:28:32.670 "base_bdev_name": "malloc1" 00:28:32.670 } 00:28:32.670 } 00:28:32.670 }' 00:28:32.670 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:32.670 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:32.929 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:32.929 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:32.929 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:32.929 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:32.929 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:32.929 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:32.929 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:32.929 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:32.929 09:32:41 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:33.188 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:33.188 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:33.188 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:33.188 09:32:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:33.188 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:33.188 "name": "pt2", 00:28:33.188 "aliases": [ 00:28:33.188 "00000000-0000-0000-0000-000000000002" 00:28:33.188 ], 00:28:33.188 "product_name": "passthru", 00:28:33.188 "block_size": 4128, 00:28:33.188 "num_blocks": 8192, 00:28:33.188 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:33.188 "md_size": 32, 00:28:33.188 "md_interleave": true, 00:28:33.188 "dif_type": 0, 00:28:33.188 "assigned_rate_limits": { 00:28:33.188 "rw_ios_per_sec": 0, 00:28:33.188 "rw_mbytes_per_sec": 0, 00:28:33.188 "r_mbytes_per_sec": 0, 00:28:33.188 "w_mbytes_per_sec": 0 00:28:33.188 }, 00:28:33.188 "claimed": true, 00:28:33.188 "claim_type": "exclusive_write", 00:28:33.188 "zoned": false, 00:28:33.188 "supported_io_types": { 00:28:33.188 "read": true, 00:28:33.188 "write": true, 00:28:33.188 "unmap": true, 00:28:33.188 "flush": true, 00:28:33.188 "reset": true, 00:28:33.188 "nvme_admin": false, 00:28:33.188 "nvme_io": false, 00:28:33.188 "nvme_io_md": false, 00:28:33.188 "write_zeroes": true, 00:28:33.188 "zcopy": true, 00:28:33.188 "get_zone_info": false, 00:28:33.188 "zone_management": false, 00:28:33.188 "zone_append": false, 00:28:33.188 "compare": false, 00:28:33.188 "compare_and_write": false, 00:28:33.188 "abort": true, 00:28:33.188 "seek_hole": false, 00:28:33.188 "seek_data": false, 00:28:33.188 "copy": true, 00:28:33.188 "nvme_iov_md": false 00:28:33.188 }, 00:28:33.188 "memory_domains": [ 00:28:33.188 { 00:28:33.188 "dma_device_id": "system", 00:28:33.188 "dma_device_type": 1 00:28:33.188 }, 00:28:33.188 { 00:28:33.188 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:33.188 "dma_device_type": 2 00:28:33.188 } 00:28:33.188 ], 00:28:33.188 "driver_specific": { 00:28:33.188 "passthru": { 00:28:33.188 "name": "pt2", 00:28:33.188 "base_bdev_name": "malloc2" 00:28:33.188 } 00:28:33.188 } 00:28:33.188 }' 00:28:33.188 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:33.446 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:33.446 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:33.446 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:33.446 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:33.446 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:33.446 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:33.446 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:33.446 09:32:42 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:33.446 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:33.705 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:33.705 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:33.705 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:33.705 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:33.964 [2024-07-15 09:32:42.711456] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:33.964 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a3a2c25b-f761-469d-8536-8a2f4ac7c7b7 00:28:33.964 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z a3a2c25b-f761-469d-8536-8a2f4ac7c7b7 ']' 00:28:33.964 09:32:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:34.531 [2024-07-15 09:32:43.212531] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:34.531 [2024-07-15 09:32:43.212556] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:34.531 [2024-07-15 09:32:43.212613] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:34.531 [2024-07-15 09:32:43.212670] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:34.532 [2024-07-15 09:32:43.212682] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x126bf20 name raid_bdev1, state offline 00:28:34.532 09:32:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.532 09:32:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:34.789 09:32:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:34.789 09:32:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:34.789 09:32:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:34.789 09:32:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:34.789 09:32:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:34.789 09:32:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:35.047 09:32:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:35.047 09:32:43 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:35.305 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:35.306 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:35.564 [2024-07-15 09:32:44.455747] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:35.564 [2024-07-15 09:32:44.457090] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:35.564 [2024-07-15 09:32:44.457147] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:35.564 [2024-07-15 09:32:44.457186] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:35.564 [2024-07-15 09:32:44.457205] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:35.564 [2024-07-15 09:32:44.457215] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1276260 name raid_bdev1, state configuring 00:28:35.564 request: 00:28:35.564 { 00:28:35.564 "name": "raid_bdev1", 00:28:35.564 "raid_level": "raid1", 00:28:35.564 "base_bdevs": [ 00:28:35.564 "malloc1", 00:28:35.564 "malloc2" 00:28:35.564 ], 00:28:35.564 "superblock": false, 00:28:35.564 "method": 
"bdev_raid_create", 00:28:35.564 "req_id": 1 00:28:35.564 } 00:28:35.564 Got JSON-RPC error response 00:28:35.564 response: 00:28:35.564 { 00:28:35.564 "code": -17, 00:28:35.564 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:35.564 } 00:28:35.564 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:28:35.564 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:35.564 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:35.564 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:35.565 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.565 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:35.824 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:35.824 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:35.824 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:36.082 [2024-07-15 09:32:44.949020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:36.082 [2024-07-15 09:32:44.949066] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:36.082 [2024-07-15 09:32:44.949084] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x126d000 00:28:36.082 [2024-07-15 09:32:44.949096] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:36.082 [2024-07-15 09:32:44.950560] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:36.082 [2024-07-15 09:32:44.950588] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:36.082 [2024-07-15 09:32:44.950635] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:36.083 [2024-07-15 09:32:44.950661] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:36.083 pt1 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:36.083 09:32:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.341 09:32:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:36.341 "name": "raid_bdev1", 00:28:36.341 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:36.341 "strip_size_kb": 0, 00:28:36.341 "state": "configuring", 00:28:36.341 "raid_level": "raid1", 00:28:36.341 "superblock": true, 00:28:36.341 "num_base_bdevs": 2, 00:28:36.342 "num_base_bdevs_discovered": 1, 00:28:36.342 "num_base_bdevs_operational": 2, 00:28:36.342 "base_bdevs_list": [ 00:28:36.342 { 00:28:36.342 "name": "pt1", 00:28:36.342 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:36.342 "is_configured": true, 00:28:36.342 "data_offset": 256, 00:28:36.342 "data_size": 7936 00:28:36.342 }, 00:28:36.342 { 00:28:36.342 "name": null, 00:28:36.342 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:36.342 "is_configured": false, 00:28:36.342 "data_offset": 256, 00:28:36.342 "data_size": 7936 00:28:36.342 } 00:28:36.342 ] 00:28:36.342 }' 00:28:36.342 09:32:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:36.342 09:32:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:36.910 09:32:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:36.910 09:32:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:36.910 09:32:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:36.910 09:32:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:37.169 [2024-07-15 09:32:46.019861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:37.169 [2024-07-15 09:32:46.019914] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:37.169 [2024-07-15 09:32:46.019941] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x126f270 00:28:37.169 [2024-07-15 09:32:46.019954] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:37.169 [2024-07-15 09:32:46.020120] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:37.169 [2024-07-15 09:32:46.020136] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:37.169 [2024-07-15 09:32:46.020180] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:37.169 [2024-07-15 09:32:46.020198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:37.169 [2024-07-15 09:32:46.020280] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10e8c10 00:28:37.169 [2024-07-15 09:32:46.020291] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:37.169 [2024-07-15 09:32:46.020344] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x126ad40 00:28:37.169 [2024-07-15 09:32:46.020417] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10e8c10 00:28:37.169 [2024-07-15 09:32:46.020427] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10e8c10 00:28:37.169 [2024-07-15 09:32:46.020484] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:37.169 pt2 00:28:37.169 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:37.169 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.170 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:37.428 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:37.428 "name": "raid_bdev1", 00:28:37.428 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:37.428 "strip_size_kb": 0, 00:28:37.428 "state": "online", 00:28:37.428 "raid_level": "raid1", 00:28:37.428 "superblock": true, 00:28:37.428 "num_base_bdevs": 2, 00:28:37.428 "num_base_bdevs_discovered": 2, 00:28:37.428 "num_base_bdevs_operational": 2, 00:28:37.428 "base_bdevs_list": [ 00:28:37.428 { 00:28:37.428 "name": "pt1", 00:28:37.428 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:37.428 "is_configured": true, 00:28:37.428 "data_offset": 256, 00:28:37.428 "data_size": 7936 00:28:37.428 }, 00:28:37.428 { 00:28:37.428 "name": "pt2", 00:28:37.428 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:37.428 "is_configured": true, 00:28:37.428 "data_offset": 256, 00:28:37.428 "data_size": 7936 00:28:37.428 } 00:28:37.428 ] 00:28:37.428 }' 00:28:37.428 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:37.428 09:32:46 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:37.995 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:37.995 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:37.995 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:37.996 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:37.996 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:37.996 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:37.996 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:37.996 09:32:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:38.256 [2024-07-15 09:32:47.123057] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:38.256 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:38.256 "name": "raid_bdev1", 00:28:38.256 "aliases": [ 00:28:38.256 "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7" 00:28:38.256 ], 00:28:38.256 "product_name": "Raid Volume", 00:28:38.256 "block_size": 4128, 00:28:38.256 "num_blocks": 7936, 00:28:38.256 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:38.256 "md_size": 32, 00:28:38.256 "md_interleave": true, 00:28:38.256 "dif_type": 0, 00:28:38.256 "assigned_rate_limits": { 00:28:38.256 "rw_ios_per_sec": 0, 00:28:38.256 "rw_mbytes_per_sec": 0, 00:28:38.256 "r_mbytes_per_sec": 0, 00:28:38.256 "w_mbytes_per_sec": 0 00:28:38.256 }, 00:28:38.256 "claimed": false, 00:28:38.256 "zoned": false, 00:28:38.256 "supported_io_types": { 00:28:38.256 "read": true, 00:28:38.256 "write": true, 00:28:38.256 "unmap": false, 00:28:38.256 "flush": false, 00:28:38.256 "reset": true, 00:28:38.256 "nvme_admin": false, 00:28:38.256 "nvme_io": false, 00:28:38.256 "nvme_io_md": false, 00:28:38.256 "write_zeroes": true, 00:28:38.256 "zcopy": false, 00:28:38.256 "get_zone_info": false, 00:28:38.256 "zone_management": false, 00:28:38.256 "zone_append": false, 00:28:38.256 "compare": false, 00:28:38.256 "compare_and_write": false, 00:28:38.256 "abort": false, 00:28:38.256 "seek_hole": false, 00:28:38.256 "seek_data": false, 00:28:38.256 "copy": false, 00:28:38.256 "nvme_iov_md": false 00:28:38.256 }, 00:28:38.256 "memory_domains": [ 00:28:38.256 { 00:28:38.256 "dma_device_id": "system", 00:28:38.256 "dma_device_type": 1 00:28:38.256 }, 00:28:38.256 { 00:28:38.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:38.256 "dma_device_type": 2 00:28:38.256 }, 00:28:38.256 { 00:28:38.256 "dma_device_id": "system", 00:28:38.256 "dma_device_type": 1 00:28:38.256 }, 00:28:38.256 { 00:28:38.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:38.256 "dma_device_type": 2 00:28:38.256 } 00:28:38.256 ], 00:28:38.256 "driver_specific": { 00:28:38.256 "raid": { 00:28:38.256 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:38.256 "strip_size_kb": 0, 00:28:38.256 "state": "online", 00:28:38.256 "raid_level": "raid1", 00:28:38.256 "superblock": true, 00:28:38.256 "num_base_bdevs": 2, 00:28:38.256 
"num_base_bdevs_discovered": 2, 00:28:38.256 "num_base_bdevs_operational": 2, 00:28:38.256 "base_bdevs_list": [ 00:28:38.256 { 00:28:38.256 "name": "pt1", 00:28:38.256 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:38.256 "is_configured": true, 00:28:38.256 "data_offset": 256, 00:28:38.256 "data_size": 7936 00:28:38.256 }, 00:28:38.256 { 00:28:38.256 "name": "pt2", 00:28:38.256 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:38.256 "is_configured": true, 00:28:38.256 "data_offset": 256, 00:28:38.256 "data_size": 7936 00:28:38.256 } 00:28:38.256 ] 00:28:38.256 } 00:28:38.256 } 00:28:38.256 }' 00:28:38.256 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:38.256 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:38.256 pt2' 00:28:38.256 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:38.256 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:38.256 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:38.516 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:38.516 "name": "pt1", 00:28:38.516 "aliases": [ 00:28:38.516 "00000000-0000-0000-0000-000000000001" 00:28:38.516 ], 00:28:38.516 "product_name": "passthru", 00:28:38.516 "block_size": 4128, 00:28:38.516 "num_blocks": 8192, 00:28:38.516 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:38.516 "md_size": 32, 00:28:38.516 "md_interleave": true, 00:28:38.516 "dif_type": 0, 00:28:38.516 "assigned_rate_limits": { 00:28:38.516 "rw_ios_per_sec": 0, 00:28:38.516 "rw_mbytes_per_sec": 0, 00:28:38.516 "r_mbytes_per_sec": 0, 00:28:38.516 "w_mbytes_per_sec": 0 00:28:38.516 }, 00:28:38.516 "claimed": true, 00:28:38.516 "claim_type": "exclusive_write", 00:28:38.516 "zoned": false, 00:28:38.516 "supported_io_types": { 00:28:38.516 "read": true, 00:28:38.516 "write": true, 00:28:38.516 "unmap": true, 00:28:38.516 "flush": true, 00:28:38.516 "reset": true, 00:28:38.516 "nvme_admin": false, 00:28:38.516 "nvme_io": false, 00:28:38.516 "nvme_io_md": false, 00:28:38.516 "write_zeroes": true, 00:28:38.516 "zcopy": true, 00:28:38.516 "get_zone_info": false, 00:28:38.516 "zone_management": false, 00:28:38.516 "zone_append": false, 00:28:38.516 "compare": false, 00:28:38.516 "compare_and_write": false, 00:28:38.516 "abort": true, 00:28:38.516 "seek_hole": false, 00:28:38.516 "seek_data": false, 00:28:38.516 "copy": true, 00:28:38.516 "nvme_iov_md": false 00:28:38.516 }, 00:28:38.516 "memory_domains": [ 00:28:38.516 { 00:28:38.516 "dma_device_id": "system", 00:28:38.516 "dma_device_type": 1 00:28:38.516 }, 00:28:38.516 { 00:28:38.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:38.516 "dma_device_type": 2 00:28:38.516 } 00:28:38.516 ], 00:28:38.516 "driver_specific": { 00:28:38.516 "passthru": { 00:28:38.516 "name": "pt1", 00:28:38.516 "base_bdev_name": "malloc1" 00:28:38.516 } 00:28:38.516 } 00:28:38.516 }' 00:28:38.516 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:38.776 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:28:38.776 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:38.776 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:38.776 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:38.776 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:38.776 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:38.776 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:38.776 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:38.776 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:39.036 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:39.036 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:39.036 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:39.036 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:39.036 09:32:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:39.296 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:39.296 "name": "pt2", 00:28:39.296 "aliases": [ 00:28:39.296 "00000000-0000-0000-0000-000000000002" 00:28:39.296 ], 00:28:39.296 "product_name": "passthru", 00:28:39.296 "block_size": 4128, 00:28:39.296 "num_blocks": 8192, 00:28:39.296 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:39.296 "md_size": 32, 00:28:39.296 "md_interleave": true, 00:28:39.296 "dif_type": 0, 00:28:39.296 "assigned_rate_limits": { 00:28:39.296 "rw_ios_per_sec": 0, 00:28:39.296 "rw_mbytes_per_sec": 0, 00:28:39.296 "r_mbytes_per_sec": 0, 00:28:39.296 "w_mbytes_per_sec": 0 00:28:39.296 }, 00:28:39.296 "claimed": true, 00:28:39.296 "claim_type": "exclusive_write", 00:28:39.296 "zoned": false, 00:28:39.296 "supported_io_types": { 00:28:39.296 "read": true, 00:28:39.296 "write": true, 00:28:39.296 "unmap": true, 00:28:39.296 "flush": true, 00:28:39.296 "reset": true, 00:28:39.296 "nvme_admin": false, 00:28:39.296 "nvme_io": false, 00:28:39.296 "nvme_io_md": false, 00:28:39.296 "write_zeroes": true, 00:28:39.296 "zcopy": true, 00:28:39.296 "get_zone_info": false, 00:28:39.296 "zone_management": false, 00:28:39.296 "zone_append": false, 00:28:39.296 "compare": false, 00:28:39.296 "compare_and_write": false, 00:28:39.296 "abort": true, 00:28:39.296 "seek_hole": false, 00:28:39.296 "seek_data": false, 00:28:39.296 "copy": true, 00:28:39.296 "nvme_iov_md": false 00:28:39.296 }, 00:28:39.296 "memory_domains": [ 00:28:39.296 { 00:28:39.296 "dma_device_id": "system", 00:28:39.296 "dma_device_type": 1 00:28:39.296 }, 00:28:39.296 { 00:28:39.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.296 "dma_device_type": 2 00:28:39.296 } 00:28:39.296 ], 00:28:39.296 "driver_specific": { 00:28:39.296 "passthru": { 00:28:39.296 "name": "pt2", 00:28:39.296 "base_bdev_name": "malloc2" 00:28:39.296 } 00:28:39.296 } 00:28:39.296 }' 00:28:39.296 09:32:48 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:39.296 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:39.296 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:39.296 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:39.296 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:39.296 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:39.296 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:39.296 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:39.556 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:39.556 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:39.556 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:39.556 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:39.556 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:39.556 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:39.814 [2024-07-15 09:32:48.602985] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:39.814 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' a3a2c25b-f761-469d-8536-8a2f4ac7c7b7 '!=' a3a2c25b-f761-469d-8536-8a2f4ac7c7b7 ']' 00:28:39.814 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:39.814 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:39.814 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:39.814 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:40.073 [2024-07-15 09:32:48.847386] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:40.073 09:32:48 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.073 09:32:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.332 09:32:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:40.332 "name": "raid_bdev1", 00:28:40.332 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:40.332 "strip_size_kb": 0, 00:28:40.332 "state": "online", 00:28:40.332 "raid_level": "raid1", 00:28:40.332 "superblock": true, 00:28:40.332 "num_base_bdevs": 2, 00:28:40.332 "num_base_bdevs_discovered": 1, 00:28:40.332 "num_base_bdevs_operational": 1, 00:28:40.332 "base_bdevs_list": [ 00:28:40.332 { 00:28:40.332 "name": null, 00:28:40.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:40.332 "is_configured": false, 00:28:40.332 "data_offset": 256, 00:28:40.332 "data_size": 7936 00:28:40.332 }, 00:28:40.332 { 00:28:40.332 "name": "pt2", 00:28:40.332 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:40.332 "is_configured": true, 00:28:40.332 "data_offset": 256, 00:28:40.332 "data_size": 7936 00:28:40.332 } 00:28:40.332 ] 00:28:40.332 }' 00:28:40.332 09:32:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:40.332 09:32:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:40.899 09:32:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:41.157 [2024-07-15 09:32:49.882108] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:41.158 [2024-07-15 09:32:49.882136] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:41.158 [2024-07-15 09:32:49.882185] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:41.158 [2024-07-15 09:32:49.882229] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:41.158 [2024-07-15 09:32:49.882242] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10e8c10 name raid_bdev1, state offline 00:28:41.158 09:32:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.158 09:32:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:41.158 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:41.158 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:41.158 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:41.158 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 
00:28:41.158 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:41.417 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:41.417 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:41.417 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:41.417 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:41.417 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:28:41.417 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:41.676 [2024-07-15 09:32:50.503734] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:41.676 [2024-07-15 09:32:50.503780] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:41.676 [2024-07-15 09:32:50.503799] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x126d9f0 00:28:41.676 [2024-07-15 09:32:50.503811] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:41.676 [2024-07-15 09:32:50.505235] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:41.676 [2024-07-15 09:32:50.505263] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:41.676 [2024-07-15 09:32:50.505309] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:41.676 [2024-07-15 09:32:50.505334] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:41.676 [2024-07-15 09:32:50.505409] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x126eea0 00:28:41.676 [2024-07-15 09:32:50.505420] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:41.676 [2024-07-15 09:32:50.505475] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x126cbc0 00:28:41.676 [2024-07-15 09:32:50.505548] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x126eea0 00:28:41.676 [2024-07-15 09:32:50.505557] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x126eea0 00:28:41.676 [2024-07-15 09:32:50.505613] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:41.676 pt2 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.676 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.935 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:41.935 "name": "raid_bdev1", 00:28:41.935 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:41.935 "strip_size_kb": 0, 00:28:41.935 "state": "online", 00:28:41.935 "raid_level": "raid1", 00:28:41.935 "superblock": true, 00:28:41.935 "num_base_bdevs": 2, 00:28:41.935 "num_base_bdevs_discovered": 1, 00:28:41.935 "num_base_bdevs_operational": 1, 00:28:41.935 "base_bdevs_list": [ 00:28:41.935 { 00:28:41.935 "name": null, 00:28:41.935 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:41.935 "is_configured": false, 00:28:41.935 "data_offset": 256, 00:28:41.935 "data_size": 7936 00:28:41.935 }, 00:28:41.935 { 00:28:41.935 "name": "pt2", 00:28:41.935 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:41.935 "is_configured": true, 00:28:41.935 "data_offset": 256, 00:28:41.935 "data_size": 7936 00:28:41.935 } 00:28:41.935 ] 00:28:41.935 }' 00:28:41.935 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:41.935 09:32:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:42.502 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:42.761 [2024-07-15 09:32:51.538464] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:42.761 [2024-07-15 09:32:51.538491] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:42.761 [2024-07-15 09:32:51.538540] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:42.761 [2024-07-15 09:32:51.538583] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:42.761 [2024-07-15 09:32:51.538595] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x126eea0 name raid_bdev1, state offline 00:28:42.761 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:42.761 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:43.020 [2024-07-15 09:32:51.895402] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:43.020 [2024-07-15 09:32:51.895443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:43.020 [2024-07-15 09:32:51.895460] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x126d620 00:28:43.020 [2024-07-15 09:32:51.895473] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:43.020 [2024-07-15 09:32:51.896887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:43.020 [2024-07-15 09:32:51.896914] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:43.020 [2024-07-15 09:32:51.896969] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:43.020 [2024-07-15 09:32:51.896994] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:43.020 [2024-07-15 09:32:51.897071] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:43.020 [2024-07-15 09:32:51.897084] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:43.020 [2024-07-15 09:32:51.897097] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x126f640 name raid_bdev1, state configuring 00:28:43.020 [2024-07-15 09:32:51.897119] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:43.020 [2024-07-15 09:32:51.897170] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x126f640 00:28:43.020 [2024-07-15 09:32:51.897181] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:43.020 [2024-07-15 09:32:51.897231] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x126e810 00:28:43.020 [2024-07-15 09:32:51.897301] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x126f640 00:28:43.020 [2024-07-15 09:32:51.897311] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x126f640 00:28:43.020 [2024-07-15 09:32:51.897368] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:43.020 pt1 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:43.020 09:32:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.020 09:32:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.280 09:32:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:43.280 "name": "raid_bdev1", 00:28:43.280 "uuid": "a3a2c25b-f761-469d-8536-8a2f4ac7c7b7", 00:28:43.280 "strip_size_kb": 0, 00:28:43.280 "state": "online", 00:28:43.280 "raid_level": "raid1", 00:28:43.280 "superblock": true, 00:28:43.280 "num_base_bdevs": 2, 00:28:43.280 "num_base_bdevs_discovered": 1, 00:28:43.280 "num_base_bdevs_operational": 1, 00:28:43.280 "base_bdevs_list": [ 00:28:43.280 { 00:28:43.280 "name": null, 00:28:43.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:43.280 "is_configured": false, 00:28:43.280 "data_offset": 256, 00:28:43.280 "data_size": 7936 00:28:43.280 }, 00:28:43.280 { 00:28:43.280 "name": "pt2", 00:28:43.280 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:43.280 "is_configured": true, 00:28:43.280 "data_offset": 256, 00:28:43.280 "data_size": 7936 00:28:43.280 } 00:28:43.280 ] 00:28:43.280 }' 00:28:43.280 09:32:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:43.280 09:32:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:43.849 09:32:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:43.849 09:32:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:44.109 09:32:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:44.109 09:32:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:44.109 09:32:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:44.459 [2024-07-15 09:32:53.167008] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' a3a2c25b-f761-469d-8536-8a2f4ac7c7b7 '!=' a3a2c25b-f761-469d-8536-8a2f4ac7c7b7 ']' 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 241575 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 241575 ']' 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 241575 00:28:44.459 09:32:53 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 241575 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 241575' 00:28:44.459 killing process with pid 241575 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 241575 00:28:44.459 [2024-07-15 09:32:53.239053] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:44.459 [2024-07-15 09:32:53.239106] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:44.459 [2024-07-15 09:32:53.239151] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:44.459 [2024-07-15 09:32:53.239163] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x126f640 name raid_bdev1, state offline 00:28:44.459 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 241575 00:28:44.459 [2024-07-15 09:32:53.255972] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:44.719 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:28:44.719 00:28:44.719 real 0m16.349s 00:28:44.719 user 0m29.676s 00:28:44.719 sys 0m2.974s 00:28:44.719 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:44.719 09:32:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:44.719 ************************************ 00:28:44.719 END TEST raid_superblock_test_md_interleaved 00:28:44.719 ************************************ 00:28:44.719 09:32:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:44.719 09:32:53 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:28:44.719 09:32:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:44.719 09:32:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:44.719 09:32:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:44.719 ************************************ 00:28:44.719 START TEST raid_rebuild_test_sb_md_interleaved 00:28:44.719 ************************************ 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:44.719 
09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=243990 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 243990 /var/tmp/spdk-raid.sock 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:44.719 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 243990 ']' 00:28:44.720 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:44.720 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:44.720 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
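Context for the lines above: the rebuild test launches bdevperf with -z so it idles until it is configured over the private RPC socket, and waitforlisten blocks until that socket answers before any bdev RPCs are sent. Below is a minimal sketch of the same launch-and-wait pattern; the bdevperf command line is copied from this log, while the rpc_get_methods probe and the 0.5 s poll interval are assumptions added for illustration, not taken from the autotest code.

# Sketch (assumed wait loop, not the SPDK test code): start bdevperf idle (-z)
# on a private RPC socket, then poll that socket until it responds.
sock=/var/tmp/spdk-raid.sock
./build/examples/bdevperf -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!
until ./scripts/rpc.py -s "$sock" -t 1 rpc_get_methods >/dev/null 2>&1; do   # liveness probe (assumed)
    sleep 0.5
done
echo "bdevperf ($raid_pid) is listening on $sock"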
00:28:44.720 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:44.720 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:44.720 09:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:44.720 [2024-07-15 09:32:53.606433] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:28:44.720 [2024-07-15 09:32:53.606495] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid243990 ] 00:28:44.720 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:44.720 Zero copy mechanism will not be used. 00:28:44.978 [2024-07-15 09:32:53.735934] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:44.978 [2024-07-15 09:32:53.845045] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:44.978 [2024-07-15 09:32:53.905893] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:44.978 [2024-07-15 09:32:53.905941] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:45.237 09:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:45.237 09:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:45.237 09:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:45.237 09:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:28:45.496 BaseBdev1_malloc 00:28:45.496 09:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:45.755 [2024-07-15 09:32:54.534668] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:45.755 [2024-07-15 09:32:54.534717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:45.755 [2024-07-15 09:32:54.534743] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b59ce0 00:28:45.755 [2024-07-15 09:32:54.534756] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:45.755 [2024-07-15 09:32:54.536325] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:45.755 [2024-07-15 09:32:54.536366] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:45.755 BaseBdev1 00:28:45.755 09:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:45.755 09:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:28:46.014 BaseBdev2_malloc 00:28:46.014 09:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:46.274 [2024-07-15 09:32:55.045264] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:46.274 [2024-07-15 09:32:55.045310] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:46.274 [2024-07-15 09:32:55.045334] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b512d0 00:28:46.274 [2024-07-15 09:32:55.045352] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:46.274 [2024-07-15 09:32:55.047142] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:46.274 [2024-07-15 09:32:55.047171] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:46.274 BaseBdev2 00:28:46.274 09:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:28:46.533 spare_malloc 00:28:46.533 09:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:46.792 spare_delay 00:28:46.792 09:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:47.051 [2024-07-15 09:32:55.792048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:47.051 [2024-07-15 09:32:55.792095] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:47.051 [2024-07-15 09:32:55.792119] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b54070 00:28:47.051 [2024-07-15 09:32:55.792131] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:47.051 [2024-07-15 09:32:55.793449] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:47.051 [2024-07-15 09:32:55.793475] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:47.051 spare 00:28:47.051 09:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:47.311 [2024-07-15 09:32:56.036725] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:47.311 [2024-07-15 09:32:56.038053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:47.311 [2024-07-15 09:32:56.038217] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b56370 00:28:47.311 [2024-07-15 09:32:56.038231] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:47.311 [2024-07-15 09:32:56.038306] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19bc9c0 00:28:47.311 [2024-07-15 09:32:56.038392] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b56370 00:28:47.311 [2024-07-15 09:32:56.038402] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b56370 00:28:47.311 [2024-07-15 09:32:56.038460] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.311 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.880 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:47.881 "name": "raid_bdev1", 00:28:47.881 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:47.881 "strip_size_kb": 0, 00:28:47.881 "state": "online", 00:28:47.881 "raid_level": "raid1", 00:28:47.881 "superblock": true, 00:28:47.881 "num_base_bdevs": 2, 00:28:47.881 "num_base_bdevs_discovered": 2, 00:28:47.881 "num_base_bdevs_operational": 2, 00:28:47.881 "base_bdevs_list": [ 00:28:47.881 { 00:28:47.881 "name": "BaseBdev1", 00:28:47.881 "uuid": "a9381053-7e73-5e5f-bffa-a6fcc578cfd6", 00:28:47.881 "is_configured": true, 00:28:47.881 "data_offset": 256, 00:28:47.881 "data_size": 7936 00:28:47.881 }, 00:28:47.881 { 00:28:47.881 "name": "BaseBdev2", 00:28:47.881 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:47.881 "is_configured": true, 00:28:47.881 "data_offset": 256, 00:28:47.881 "data_size": 7936 00:28:47.881 } 00:28:47.881 ] 00:28:47.881 }' 00:28:47.881 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:47.881 09:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:48.448 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:48.448 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:48.448 [2024-07-15 09:32:57.400670] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:48.707 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:48.707 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.707 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:48.967 [2024-07-15 09:32:57.897741] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:48.967 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:49.226 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.226 09:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:49.226 09:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:49.226 "name": "raid_bdev1", 00:28:49.226 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:49.226 "strip_size_kb": 0, 00:28:49.226 "state": "online", 00:28:49.226 "raid_level": "raid1", 00:28:49.226 "superblock": true, 00:28:49.226 "num_base_bdevs": 2, 00:28:49.226 "num_base_bdevs_discovered": 1, 00:28:49.226 "num_base_bdevs_operational": 1, 00:28:49.226 "base_bdevs_list": [ 00:28:49.226 { 00:28:49.226 "name": null, 00:28:49.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:49.226 "is_configured": false, 00:28:49.226 "data_offset": 256, 00:28:49.226 "data_size": 7936 00:28:49.226 }, 00:28:49.226 { 00:28:49.226 "name": "BaseBdev2", 00:28:49.226 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:49.226 "is_configured": true, 00:28:49.226 "data_offset": 256, 00:28:49.226 "data_size": 7936 00:28:49.226 } 00:28:49.226 ] 00:28:49.226 }' 00:28:49.226 09:32:58 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:49.226 09:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:50.164 09:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:50.422 [2024-07-15 09:32:59.257357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:50.422 [2024-07-15 09:32:59.261000] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b56250 00:28:50.422 [2024-07-15 09:32:59.263000] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:50.422 09:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:51.355 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:51.355 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:51.355 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:51.355 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:51.355 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:51.355 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.355 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.612 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:51.612 "name": "raid_bdev1", 00:28:51.612 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:51.612 "strip_size_kb": 0, 00:28:51.612 "state": "online", 00:28:51.612 "raid_level": "raid1", 00:28:51.612 "superblock": true, 00:28:51.612 "num_base_bdevs": 2, 00:28:51.612 "num_base_bdevs_discovered": 2, 00:28:51.612 "num_base_bdevs_operational": 2, 00:28:51.612 "process": { 00:28:51.612 "type": "rebuild", 00:28:51.612 "target": "spare", 00:28:51.612 "progress": { 00:28:51.612 "blocks": 3072, 00:28:51.612 "percent": 38 00:28:51.612 } 00:28:51.612 }, 00:28:51.612 "base_bdevs_list": [ 00:28:51.612 { 00:28:51.612 "name": "spare", 00:28:51.612 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:28:51.613 "is_configured": true, 00:28:51.613 "data_offset": 256, 00:28:51.613 "data_size": 7936 00:28:51.613 }, 00:28:51.613 { 00:28:51.613 "name": "BaseBdev2", 00:28:51.613 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:51.613 "is_configured": true, 00:28:51.613 "data_offset": 256, 00:28:51.613 "data_size": 7936 00:28:51.613 } 00:28:51.613 ] 00:28:51.613 }' 00:28:51.613 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:51.871 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:51.871 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:51.871 09:33:00 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:51.871 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:52.130 [2024-07-15 09:33:00.852036] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:52.130 [2024-07-15 09:33:00.875291] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:52.130 [2024-07-15 09:33:00.875334] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:52.130 [2024-07-15 09:33:00.875350] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:52.130 [2024-07-15 09:33:00.875359] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.130 09:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:52.388 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:52.388 "name": "raid_bdev1", 00:28:52.388 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:52.388 "strip_size_kb": 0, 00:28:52.388 "state": "online", 00:28:52.388 "raid_level": "raid1", 00:28:52.388 "superblock": true, 00:28:52.388 "num_base_bdevs": 2, 00:28:52.388 "num_base_bdevs_discovered": 1, 00:28:52.388 "num_base_bdevs_operational": 1, 00:28:52.388 "base_bdevs_list": [ 00:28:52.388 { 00:28:52.388 "name": null, 00:28:52.388 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:52.388 "is_configured": false, 00:28:52.388 "data_offset": 256, 00:28:52.388 "data_size": 7936 00:28:52.388 }, 00:28:52.388 { 00:28:52.388 "name": "BaseBdev2", 00:28:52.388 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:52.388 "is_configured": true, 00:28:52.388 "data_offset": 256, 00:28:52.388 "data_size": 7936 00:28:52.388 } 00:28:52.388 ] 00:28:52.388 }' 00:28:52.388 
09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:52.388 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:52.953 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:52.953 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:52.953 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:52.953 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:52.953 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:52.953 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.953 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:53.212 09:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:53.212 "name": "raid_bdev1", 00:28:53.212 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:53.212 "strip_size_kb": 0, 00:28:53.212 "state": "online", 00:28:53.212 "raid_level": "raid1", 00:28:53.212 "superblock": true, 00:28:53.212 "num_base_bdevs": 2, 00:28:53.212 "num_base_bdevs_discovered": 1, 00:28:53.212 "num_base_bdevs_operational": 1, 00:28:53.212 "base_bdevs_list": [ 00:28:53.212 { 00:28:53.212 "name": null, 00:28:53.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:53.212 "is_configured": false, 00:28:53.212 "data_offset": 256, 00:28:53.212 "data_size": 7936 00:28:53.212 }, 00:28:53.212 { 00:28:53.212 "name": "BaseBdev2", 00:28:53.212 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:53.212 "is_configured": true, 00:28:53.212 "data_offset": 256, 00:28:53.212 "data_size": 7936 00:28:53.212 } 00:28:53.212 ] 00:28:53.212 }' 00:28:53.212 09:33:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:53.212 09:33:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:53.212 09:33:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:53.212 09:33:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:53.212 09:33:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:53.471 [2024-07-15 09:33:02.307238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:53.471 [2024-07-15 09:33:02.310826] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b52270 00:28:53.471 [2024-07-15 09:33:02.312262] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:53.471 09:33:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:54.407 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
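Adding spare back with bdev_raid_add_base_bdev starts an automatic rebuild, and the test then polls bdev_raid_get_bdevs to confirm a rebuild process targeting spare is in flight. A rough sketch of that polling loop is below; it reuses only the RPC call and jq filters visible in this log, while the loop structure and 1 s interval are assumptions.

# Sketch: watch the rebuild the way verify_raid_bdev_process does, by filtering
# bdev_raid_get_bdevs output with jq until the process entry disappears.
sock=/var/tmp/spdk-raid.sock
while true; do
    info=$(./scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [ "$(jq -r '.process.type // "none"' <<< "$info")" = rebuild ] || break   # no process object once rebuild is done
    jq -r '.process.progress | "\(.blocks) blocks, \(.percent)%"' <<< "$info"
    sleep 1
done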
00:28:54.407 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:54.407 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:54.407 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:54.407 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:54.407 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.407 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:54.665 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:54.665 "name": "raid_bdev1", 00:28:54.665 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:54.665 "strip_size_kb": 0, 00:28:54.665 "state": "online", 00:28:54.665 "raid_level": "raid1", 00:28:54.665 "superblock": true, 00:28:54.665 "num_base_bdevs": 2, 00:28:54.665 "num_base_bdevs_discovered": 2, 00:28:54.665 "num_base_bdevs_operational": 2, 00:28:54.665 "process": { 00:28:54.665 "type": "rebuild", 00:28:54.665 "target": "spare", 00:28:54.665 "progress": { 00:28:54.665 "blocks": 2816, 00:28:54.665 "percent": 35 00:28:54.665 } 00:28:54.665 }, 00:28:54.665 "base_bdevs_list": [ 00:28:54.665 { 00:28:54.665 "name": "spare", 00:28:54.665 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:28:54.665 "is_configured": true, 00:28:54.665 "data_offset": 256, 00:28:54.665 "data_size": 7936 00:28:54.665 }, 00:28:54.665 { 00:28:54.665 "name": "BaseBdev2", 00:28:54.665 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:54.665 "is_configured": true, 00:28:54.665 "data_offset": 256, 00:28:54.665 "data_size": 7936 00:28:54.665 } 00:28:54.665 ] 00:28:54.665 }' 00:28:54.665 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:54.665 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:54.665 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:54.924 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:54.925 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1127 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:54.925 09:33:03 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:54.925 "name": "raid_bdev1", 00:28:54.925 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:54.925 "strip_size_kb": 0, 00:28:54.925 "state": "online", 00:28:54.925 "raid_level": "raid1", 00:28:54.925 "superblock": true, 00:28:54.925 "num_base_bdevs": 2, 00:28:54.925 "num_base_bdevs_discovered": 2, 00:28:54.925 "num_base_bdevs_operational": 2, 00:28:54.925 "process": { 00:28:54.925 "type": "rebuild", 00:28:54.925 "target": "spare", 00:28:54.925 "progress": { 00:28:54.925 "blocks": 3840, 00:28:54.925 "percent": 48 00:28:54.925 } 00:28:54.925 }, 00:28:54.925 "base_bdevs_list": [ 00:28:54.925 { 00:28:54.925 "name": "spare", 00:28:54.925 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:28:54.925 "is_configured": true, 00:28:54.925 "data_offset": 256, 00:28:54.925 "data_size": 7936 00:28:54.925 }, 00:28:54.925 { 00:28:54.925 "name": "BaseBdev2", 00:28:54.925 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:54.925 "is_configured": true, 00:28:54.925 "data_offset": 256, 00:28:54.925 "data_size": 7936 00:28:54.925 } 00:28:54.925 ] 00:28:54.925 }' 00:28:54.925 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:55.184 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:55.184 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:55.184 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:55.184 09:33:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:56.121 09:33:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:56.121 09:33:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:56.121 09:33:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:56.121 09:33:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:56.121 09:33:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:56.121 09:33:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:56.121 
09:33:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.121 09:33:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:56.380 09:33:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:56.380 "name": "raid_bdev1", 00:28:56.380 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:56.380 "strip_size_kb": 0, 00:28:56.380 "state": "online", 00:28:56.380 "raid_level": "raid1", 00:28:56.380 "superblock": true, 00:28:56.380 "num_base_bdevs": 2, 00:28:56.380 "num_base_bdevs_discovered": 2, 00:28:56.380 "num_base_bdevs_operational": 2, 00:28:56.380 "process": { 00:28:56.380 "type": "rebuild", 00:28:56.380 "target": "spare", 00:28:56.380 "progress": { 00:28:56.380 "blocks": 7168, 00:28:56.380 "percent": 90 00:28:56.380 } 00:28:56.380 }, 00:28:56.380 "base_bdevs_list": [ 00:28:56.380 { 00:28:56.380 "name": "spare", 00:28:56.380 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:28:56.380 "is_configured": true, 00:28:56.380 "data_offset": 256, 00:28:56.380 "data_size": 7936 00:28:56.380 }, 00:28:56.380 { 00:28:56.380 "name": "BaseBdev2", 00:28:56.380 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:56.380 "is_configured": true, 00:28:56.380 "data_offset": 256, 00:28:56.380 "data_size": 7936 00:28:56.380 } 00:28:56.380 ] 00:28:56.380 }' 00:28:56.380 09:33:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:56.380 09:33:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:56.380 09:33:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:56.380 09:33:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:56.380 09:33:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:56.639 [2024-07-15 09:33:05.436553] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:56.639 [2024-07-15 09:33:05.436611] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:56.639 [2024-07-15 09:33:05.436701] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:57.576 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:57.576 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:57.576 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:57.576 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:57.576 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:57.576 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:57.576 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.576 09:33:06 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:57.835 "name": "raid_bdev1", 00:28:57.835 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:57.835 "strip_size_kb": 0, 00:28:57.835 "state": "online", 00:28:57.835 "raid_level": "raid1", 00:28:57.835 "superblock": true, 00:28:57.835 "num_base_bdevs": 2, 00:28:57.835 "num_base_bdevs_discovered": 2, 00:28:57.835 "num_base_bdevs_operational": 2, 00:28:57.835 "base_bdevs_list": [ 00:28:57.835 { 00:28:57.835 "name": "spare", 00:28:57.835 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:28:57.835 "is_configured": true, 00:28:57.835 "data_offset": 256, 00:28:57.835 "data_size": 7936 00:28:57.835 }, 00:28:57.835 { 00:28:57.835 "name": "BaseBdev2", 00:28:57.835 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:57.835 "is_configured": true, 00:28:57.835 "data_offset": 256, 00:28:57.835 "data_size": 7936 00:28:57.835 } 00:28:57.835 ] 00:28:57.835 }' 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.835 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:58.094 "name": "raid_bdev1", 00:28:58.094 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:58.094 "strip_size_kb": 0, 00:28:58.094 "state": "online", 00:28:58.094 "raid_level": "raid1", 00:28:58.094 "superblock": true, 00:28:58.094 "num_base_bdevs": 2, 00:28:58.094 "num_base_bdevs_discovered": 2, 00:28:58.094 "num_base_bdevs_operational": 2, 00:28:58.094 "base_bdevs_list": [ 00:28:58.094 { 00:28:58.094 "name": "spare", 00:28:58.094 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:28:58.094 "is_configured": true, 00:28:58.094 "data_offset": 256, 00:28:58.094 "data_size": 7936 00:28:58.094 }, 00:28:58.094 { 00:28:58.094 "name": "BaseBdev2", 00:28:58.094 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:58.094 
"is_configured": true, 00:28:58.094 "data_offset": 256, 00:28:58.094 "data_size": 7936 00:28:58.094 } 00:28:58.094 ] 00:28:58.094 }' 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:58.094 09:33:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:58.353 09:33:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:58.353 "name": "raid_bdev1", 00:28:58.353 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:28:58.353 "strip_size_kb": 0, 00:28:58.353 "state": "online", 00:28:58.353 "raid_level": "raid1", 00:28:58.353 "superblock": true, 00:28:58.353 "num_base_bdevs": 2, 00:28:58.353 "num_base_bdevs_discovered": 2, 00:28:58.353 "num_base_bdevs_operational": 2, 00:28:58.353 "base_bdevs_list": [ 00:28:58.353 { 00:28:58.353 "name": "spare", 00:28:58.353 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:28:58.353 "is_configured": true, 00:28:58.353 "data_offset": 256, 00:28:58.353 "data_size": 7936 00:28:58.353 }, 00:28:58.353 { 00:28:58.353 "name": "BaseBdev2", 00:28:58.353 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:28:58.353 "is_configured": true, 00:28:58.353 "data_offset": 256, 00:28:58.353 "data_size": 7936 00:28:58.353 } 00:28:58.353 ] 00:28:58.353 }' 00:28:58.353 09:33:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:58.353 09:33:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:58.921 09:33:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:59.183 [2024-07-15 09:33:08.052230] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:59.183 [2024-07-15 09:33:08.052260] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:59.183 [2024-07-15 09:33:08.052317] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:59.183 [2024-07-15 09:33:08.052372] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:59.183 [2024-07-15 09:33:08.052384] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b56370 name raid_bdev1, state offline 00:28:59.183 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:28:59.183 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.520 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:59.520 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:28:59.520 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:59.520 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:59.780 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:00.040 [2024-07-15 09:33:08.794159] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:00.040 [2024-07-15 09:33:08.794207] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:00.040 [2024-07-15 09:33:08.794230] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b56040 00:29:00.040 [2024-07-15 09:33:08.794243] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:00.040 [2024-07-15 09:33:08.795742] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:00.040 [2024-07-15 09:33:08.795769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:00.040 [2024-07-15 09:33:08.795829] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:00.040 [2024-07-15 09:33:08.795854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:00.040 [2024-07-15 09:33:08.795952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:00.040 spare 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
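The steps above tear the array down with bdev_raid_delete, confirm via the jq length check that no raid bdevs remain, and then recreate the spare passthru; because the raid was created with -s, the superblock found on spare lets raid_bdev1 reassemble automatically. A condensed sketch of that delete-and-reassemble check, built only from RPCs that appear in this log, with the final state query added as an illustrative assumption:

# Sketch: delete the array, verify nothing is left, then re-register a base bdev
# and let its on-disk superblock drive reassembly of raid_bdev1.
sock=/var/tmp/spdk-raid.sock
./scripts/rpc.py -s "$sock" bdev_raid_delete raid_bdev1
[ "$(./scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all | jq length)" -eq 0 ]        # no raid bdevs remain
./scripts/rpc.py -s "$sock" bdev_passthru_delete spare
./scripts/rpc.py -s "$sock" bdev_passthru_create -b spare_delay -p spare              # superblock examined here
./scripts/rpc.py -s "$sock" bdev_raid_get_bdevs online | jq -r '.[] | select(.name == "raid_bdev1").state'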
00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.040 09:33:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:00.040 [2024-07-15 09:33:08.896260] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b56f60 00:29:00.040 [2024-07-15 09:33:08.896279] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:00.040 [2024-07-15 09:33:08.896360] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b56de0 00:29:00.040 [2024-07-15 09:33:08.896450] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b56f60 00:29:00.040 [2024-07-15 09:33:08.896460] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b56f60 00:29:00.040 [2024-07-15 09:33:08.896530] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:00.300 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:00.300 "name": "raid_bdev1", 00:29:00.300 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:00.300 "strip_size_kb": 0, 00:29:00.300 "state": "online", 00:29:00.300 "raid_level": "raid1", 00:29:00.300 "superblock": true, 00:29:00.300 "num_base_bdevs": 2, 00:29:00.300 "num_base_bdevs_discovered": 2, 00:29:00.300 "num_base_bdevs_operational": 2, 00:29:00.300 "base_bdevs_list": [ 00:29:00.300 { 00:29:00.300 "name": "spare", 00:29:00.300 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:29:00.300 "is_configured": true, 00:29:00.300 "data_offset": 256, 00:29:00.300 "data_size": 7936 00:29:00.300 }, 00:29:00.300 { 00:29:00.300 "name": "BaseBdev2", 00:29:00.300 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:00.300 "is_configured": true, 00:29:00.300 "data_offset": 256, 00:29:00.300 "data_size": 7936 00:29:00.300 } 00:29:00.300 ] 00:29:00.300 }' 00:29:00.300 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:00.300 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:00.867 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:00.867 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:00.867 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:00.867 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:29:00.867 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:00.867 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.867 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.127 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:01.127 "name": "raid_bdev1", 00:29:01.127 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:01.127 "strip_size_kb": 0, 00:29:01.127 "state": "online", 00:29:01.127 "raid_level": "raid1", 00:29:01.127 "superblock": true, 00:29:01.127 "num_base_bdevs": 2, 00:29:01.127 "num_base_bdevs_discovered": 2, 00:29:01.127 "num_base_bdevs_operational": 2, 00:29:01.127 "base_bdevs_list": [ 00:29:01.127 { 00:29:01.127 "name": "spare", 00:29:01.127 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:29:01.127 "is_configured": true, 00:29:01.127 "data_offset": 256, 00:29:01.127 "data_size": 7936 00:29:01.127 }, 00:29:01.127 { 00:29:01.127 "name": "BaseBdev2", 00:29:01.127 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:01.127 "is_configured": true, 00:29:01.127 "data_offset": 256, 00:29:01.127 "data_size": 7936 00:29:01.127 } 00:29:01.127 ] 00:29:01.127 }' 00:29:01.127 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:01.127 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:01.127 09:33:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:01.127 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:01.127 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:01.127 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.387 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:01.387 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:01.645 [2024-07-15 09:33:10.478861] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:01.645 
09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.645 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.904 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:01.904 "name": "raid_bdev1", 00:29:01.904 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:01.904 "strip_size_kb": 0, 00:29:01.904 "state": "online", 00:29:01.904 "raid_level": "raid1", 00:29:01.904 "superblock": true, 00:29:01.904 "num_base_bdevs": 2, 00:29:01.904 "num_base_bdevs_discovered": 1, 00:29:01.904 "num_base_bdevs_operational": 1, 00:29:01.904 "base_bdevs_list": [ 00:29:01.904 { 00:29:01.904 "name": null, 00:29:01.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:01.904 "is_configured": false, 00:29:01.904 "data_offset": 256, 00:29:01.904 "data_size": 7936 00:29:01.904 }, 00:29:01.904 { 00:29:01.904 "name": "BaseBdev2", 00:29:01.904 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:01.904 "is_configured": true, 00:29:01.904 "data_offset": 256, 00:29:01.904 "data_size": 7936 00:29:01.904 } 00:29:01.904 ] 00:29:01.904 }' 00:29:01.904 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:01.904 09:33:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:02.471 09:33:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:02.730 [2024-07-15 09:33:11.593837] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:02.730 [2024-07-15 09:33:11.593991] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:02.730 [2024-07-15 09:33:11.594009] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
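The sequence above drops spare out of the raid1 set, leaving the array online but degraded with 1 of 2 members, and then re-adds it; since the superblock on spare carries an older seq_number (4 vs 5), it is re-added as a rebuild target rather than a clean member. A short sketch of that degrade/re-add cycle follows, using only RPCs and JSON fields shown in this log; the jq formatting string is an illustrative assumption.

# Sketch: remove a base bdev, inspect the degraded state, then re-add it and let
# the stale superblock trigger a rebuild onto it.
sock=/var/tmp/spdk-raid.sock
./scripts/rpc.py -s "$sock" bdev_raid_remove_base_bdev spare
./scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state): \(.num_base_bdevs_discovered)/\(.num_base_bdevs) members"'
./scripts/rpc.py -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare                  # stale superblock => rebuild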
00:29:02.730 [2024-07-15 09:33:11.594036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:02.730 [2024-07-15 09:33:11.597477] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b574e0 00:29:02.730 [2024-07-15 09:33:11.598881] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:02.730 09:33:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:04.108 "name": "raid_bdev1", 00:29:04.108 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:04.108 "strip_size_kb": 0, 00:29:04.108 "state": "online", 00:29:04.108 "raid_level": "raid1", 00:29:04.108 "superblock": true, 00:29:04.108 "num_base_bdevs": 2, 00:29:04.108 "num_base_bdevs_discovered": 2, 00:29:04.108 "num_base_bdevs_operational": 2, 00:29:04.108 "process": { 00:29:04.108 "type": "rebuild", 00:29:04.108 "target": "spare", 00:29:04.108 "progress": { 00:29:04.108 "blocks": 3072, 00:29:04.108 "percent": 38 00:29:04.108 } 00:29:04.108 }, 00:29:04.108 "base_bdevs_list": [ 00:29:04.108 { 00:29:04.108 "name": "spare", 00:29:04.108 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:29:04.108 "is_configured": true, 00:29:04.108 "data_offset": 256, 00:29:04.108 "data_size": 7936 00:29:04.108 }, 00:29:04.108 { 00:29:04.108 "name": "BaseBdev2", 00:29:04.108 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:04.108 "is_configured": true, 00:29:04.108 "data_offset": 256, 00:29:04.108 "data_size": 7936 00:29:04.108 } 00:29:04.108 ] 00:29:04.108 }' 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:04.108 09:33:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:04.367 [2024-07-15 09:33:13.192033] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:04.367 [2024-07-15 09:33:13.211448] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:04.367 [2024-07-15 09:33:13.211490] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:04.367 [2024-07-15 09:33:13.211505] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:04.367 [2024-07-15 09:33:13.211514] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.367 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:04.626 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:04.626 "name": "raid_bdev1", 00:29:04.626 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:04.626 "strip_size_kb": 0, 00:29:04.626 "state": "online", 00:29:04.626 "raid_level": "raid1", 00:29:04.626 "superblock": true, 00:29:04.626 "num_base_bdevs": 2, 00:29:04.626 "num_base_bdevs_discovered": 1, 00:29:04.626 "num_base_bdevs_operational": 1, 00:29:04.626 "base_bdevs_list": [ 00:29:04.626 { 00:29:04.626 "name": null, 00:29:04.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:04.626 "is_configured": false, 00:29:04.626 "data_offset": 256, 00:29:04.626 "data_size": 7936 00:29:04.626 }, 00:29:04.626 { 00:29:04.626 "name": "BaseBdev2", 00:29:04.626 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:04.626 "is_configured": true, 00:29:04.626 "data_offset": 256, 00:29:04.626 "data_size": 7936 00:29:04.626 } 00:29:04.626 ] 00:29:04.626 }' 00:29:04.626 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:04.626 09:33:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:05.195 09:33:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:05.455 [2024-07-15 
09:33:14.306406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:05.455 [2024-07-15 09:33:14.306458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:05.455 [2024-07-15 09:33:14.306482] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b55c80 00:29:05.455 [2024-07-15 09:33:14.306495] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:05.455 [2024-07-15 09:33:14.306695] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:05.455 [2024-07-15 09:33:14.306711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:05.455 [2024-07-15 09:33:14.306767] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:05.455 [2024-07-15 09:33:14.306780] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:05.455 [2024-07-15 09:33:14.306791] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:05.455 [2024-07-15 09:33:14.306808] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:05.455 [2024-07-15 09:33:14.310804] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b47ad0 00:29:05.455 spare 00:29:05.455 [2024-07-15 09:33:14.312225] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:05.455 09:33:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:06.393 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:06.393 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:06.393 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:06.393 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:06.393 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:06.393 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.393 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.652 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:06.652 "name": "raid_bdev1", 00:29:06.652 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:06.652 "strip_size_kb": 0, 00:29:06.652 "state": "online", 00:29:06.652 "raid_level": "raid1", 00:29:06.652 "superblock": true, 00:29:06.652 "num_base_bdevs": 2, 00:29:06.652 "num_base_bdevs_discovered": 2, 00:29:06.652 "num_base_bdevs_operational": 2, 00:29:06.652 "process": { 00:29:06.652 "type": "rebuild", 00:29:06.652 "target": "spare", 00:29:06.652 "progress": { 00:29:06.652 "blocks": 3072, 00:29:06.652 "percent": 38 00:29:06.652 } 00:29:06.652 }, 00:29:06.652 "base_bdevs_list": [ 00:29:06.652 { 00:29:06.652 "name": "spare", 00:29:06.652 "uuid": "f670f86e-5526-50d5-983f-e153c8aa6510", 00:29:06.652 "is_configured": true, 00:29:06.652 "data_offset": 256, 00:29:06.652 
"data_size": 7936 00:29:06.652 }, 00:29:06.652 { 00:29:06.652 "name": "BaseBdev2", 00:29:06.652 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:06.652 "is_configured": true, 00:29:06.652 "data_offset": 256, 00:29:06.652 "data_size": 7936 00:29:06.652 } 00:29:06.652 ] 00:29:06.652 }' 00:29:06.652 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:06.911 [2024-07-15 09:33:15.798727] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:06.911 [2024-07-15 09:33:15.824127] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:06.911 [2024-07-15 09:33:15.824168] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:06.911 [2024-07-15 09:33:15.824183] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:06.911 [2024-07-15 09:33:15.824191] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.911 09:33:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.169 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:07.169 "name": "raid_bdev1", 00:29:07.169 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:07.169 "strip_size_kb": 0, 00:29:07.169 "state": "online", 00:29:07.169 
"raid_level": "raid1", 00:29:07.169 "superblock": true, 00:29:07.169 "num_base_bdevs": 2, 00:29:07.169 "num_base_bdevs_discovered": 1, 00:29:07.169 "num_base_bdevs_operational": 1, 00:29:07.169 "base_bdevs_list": [ 00:29:07.169 { 00:29:07.169 "name": null, 00:29:07.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:07.170 "is_configured": false, 00:29:07.170 "data_offset": 256, 00:29:07.170 "data_size": 7936 00:29:07.170 }, 00:29:07.170 { 00:29:07.170 "name": "BaseBdev2", 00:29:07.170 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:07.170 "is_configured": true, 00:29:07.170 "data_offset": 256, 00:29:07.170 "data_size": 7936 00:29:07.170 } 00:29:07.170 ] 00:29:07.170 }' 00:29:07.170 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:07.170 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:07.737 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:07.737 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:07.737 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:07.737 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:07.737 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:07.995 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.995 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.995 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:07.995 "name": "raid_bdev1", 00:29:07.995 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:07.995 "strip_size_kb": 0, 00:29:07.995 "state": "online", 00:29:07.995 "raid_level": "raid1", 00:29:07.995 "superblock": true, 00:29:07.995 "num_base_bdevs": 2, 00:29:07.995 "num_base_bdevs_discovered": 1, 00:29:07.995 "num_base_bdevs_operational": 1, 00:29:07.995 "base_bdevs_list": [ 00:29:07.995 { 00:29:07.995 "name": null, 00:29:07.995 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:07.995 "is_configured": false, 00:29:07.995 "data_offset": 256, 00:29:07.995 "data_size": 7936 00:29:07.995 }, 00:29:07.995 { 00:29:07.995 "name": "BaseBdev2", 00:29:07.995 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:07.995 "is_configured": true, 00:29:07.995 "data_offset": 256, 00:29:07.995 "data_size": 7936 00:29:07.995 } 00:29:07.995 ] 00:29:07.995 }' 00:29:07.995 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:08.253 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:08.253 09:33:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:08.253 09:33:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:08.253 09:33:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:08.511 09:33:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:08.770 [2024-07-15 09:33:17.500403] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:08.770 [2024-07-15 09:33:17.500448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:08.770 [2024-07-15 09:33:17.500469] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b572e0 00:29:08.770 [2024-07-15 09:33:17.500482] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:08.770 [2024-07-15 09:33:17.500643] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:08.770 [2024-07-15 09:33:17.500659] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:08.770 [2024-07-15 09:33:17.500703] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:08.770 [2024-07-15 09:33:17.500715] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:08.770 [2024-07-15 09:33:17.500725] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:08.770 BaseBdev1 00:29:08.770 09:33:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.705 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:09.964 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:09.964 "name": "raid_bdev1", 00:29:09.964 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:09.964 "strip_size_kb": 0, 00:29:09.964 "state": "online", 00:29:09.964 "raid_level": "raid1", 00:29:09.964 
"superblock": true, 00:29:09.964 "num_base_bdevs": 2, 00:29:09.964 "num_base_bdevs_discovered": 1, 00:29:09.964 "num_base_bdevs_operational": 1, 00:29:09.964 "base_bdevs_list": [ 00:29:09.964 { 00:29:09.964 "name": null, 00:29:09.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:09.964 "is_configured": false, 00:29:09.964 "data_offset": 256, 00:29:09.964 "data_size": 7936 00:29:09.964 }, 00:29:09.964 { 00:29:09.964 "name": "BaseBdev2", 00:29:09.964 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:09.964 "is_configured": true, 00:29:09.964 "data_offset": 256, 00:29:09.964 "data_size": 7936 00:29:09.964 } 00:29:09.964 ] 00:29:09.964 }' 00:29:09.964 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:09.964 09:33:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:10.530 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:10.530 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:10.530 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:10.530 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:10.530 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:10.530 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.530 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:10.788 "name": "raid_bdev1", 00:29:10.788 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:10.788 "strip_size_kb": 0, 00:29:10.788 "state": "online", 00:29:10.788 "raid_level": "raid1", 00:29:10.788 "superblock": true, 00:29:10.788 "num_base_bdevs": 2, 00:29:10.788 "num_base_bdevs_discovered": 1, 00:29:10.788 "num_base_bdevs_operational": 1, 00:29:10.788 "base_bdevs_list": [ 00:29:10.788 { 00:29:10.788 "name": null, 00:29:10.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:10.788 "is_configured": false, 00:29:10.788 "data_offset": 256, 00:29:10.788 "data_size": 7936 00:29:10.788 }, 00:29:10.788 { 00:29:10.788 "name": "BaseBdev2", 00:29:10.788 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:10.788 "is_configured": true, 00:29:10.788 "data_offset": 256, 00:29:10.788 "data_size": 7936 00:29:10.788 } 00:29:10.788 ] 00:29:10.788 }' 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:10.788 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:11.046 [2024-07-15 09:33:19.902832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:11.046 [2024-07-15 09:33:19.902968] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:11.046 [2024-07-15 09:33:19.902984] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:11.046 request: 00:29:11.046 { 00:29:11.046 "base_bdev": "BaseBdev1", 00:29:11.046 "raid_bdev": "raid_bdev1", 00:29:11.046 "method": "bdev_raid_add_base_bdev", 00:29:11.046 "req_id": 1 00:29:11.046 } 00:29:11.046 Got JSON-RPC error response 00:29:11.046 response: 00:29:11.046 { 00:29:11.046 "code": -22, 00:29:11.046 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:11.046 } 00:29:11.046 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:11.046 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:11.046 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:11.046 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:11.046 09:33:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:11.983 09:33:20 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.983 09:33:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.549 09:33:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:12.549 "name": "raid_bdev1", 00:29:12.549 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:12.549 "strip_size_kb": 0, 00:29:12.549 "state": "online", 00:29:12.549 "raid_level": "raid1", 00:29:12.549 "superblock": true, 00:29:12.549 "num_base_bdevs": 2, 00:29:12.549 "num_base_bdevs_discovered": 1, 00:29:12.549 "num_base_bdevs_operational": 1, 00:29:12.549 "base_bdevs_list": [ 00:29:12.549 { 00:29:12.549 "name": null, 00:29:12.549 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:12.549 "is_configured": false, 00:29:12.549 "data_offset": 256, 00:29:12.549 "data_size": 7936 00:29:12.550 }, 00:29:12.550 { 00:29:12.550 "name": "BaseBdev2", 00:29:12.550 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:12.550 "is_configured": true, 00:29:12.550 "data_offset": 256, 00:29:12.550 "data_size": 7936 00:29:12.550 } 00:29:12.550 ] 00:29:12.550 }' 00:29:12.550 09:33:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:12.550 09:33:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:13.115 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:13.115 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:13.115 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:13.115 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:13.115 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:13.115 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.115 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:13.682 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:13.682 "name": "raid_bdev1", 00:29:13.682 "uuid": "2d2f61d8-6770-44ae-8c46-0ebca1c10b8b", 00:29:13.682 "strip_size_kb": 0, 00:29:13.682 "state": "online", 00:29:13.682 "raid_level": "raid1", 00:29:13.682 "superblock": true, 00:29:13.682 "num_base_bdevs": 2, 00:29:13.682 "num_base_bdevs_discovered": 1, 00:29:13.682 "num_base_bdevs_operational": 1, 00:29:13.682 "base_bdevs_list": [ 00:29:13.682 { 00:29:13.682 "name": null, 00:29:13.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:13.682 "is_configured": false, 00:29:13.682 "data_offset": 256, 00:29:13.682 "data_size": 7936 00:29:13.682 }, 00:29:13.682 { 00:29:13.682 "name": "BaseBdev2", 00:29:13.682 "uuid": "9c5694bd-7a4c-5c3e-be92-a1f2630a3926", 00:29:13.682 "is_configured": true, 00:29:13.682 "data_offset": 256, 00:29:13.682 "data_size": 7936 00:29:13.682 } 00:29:13.682 ] 00:29:13.682 }' 00:29:13.682 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:13.682 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:13.682 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:13.940 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 243990 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 243990 ']' 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 243990 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 243990 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 243990' 00:29:13.941 killing process with pid 243990 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 243990 00:29:13.941 Received shutdown signal, test time was about 60.000000 seconds 00:29:13.941 00:29:13.941 Latency(us) 00:29:13.941 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:13.941 =================================================================================================================== 00:29:13.941 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:13.941 [2024-07-15 09:33:22.677835] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:13.941 [2024-07-15 09:33:22.677920] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:13.941 [2024-07-15 09:33:22.677970] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:13.941 [2024-07-15 09:33:22.677982] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b56f60 name raid_bdev1, state offline 00:29:13.941 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 243990 00:29:13.941 [2024-07-15 09:33:22.705158] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:14.203 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:29:14.203 00:29:14.203 real 0m29.374s 00:29:14.203 user 0m47.448s 00:29:14.203 sys 0m3.936s 00:29:14.203 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:14.203 09:33:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:14.203 ************************************ 00:29:14.203 END TEST raid_rebuild_test_sb_md_interleaved 00:29:14.203 ************************************ 00:29:14.203 09:33:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:14.203 09:33:22 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:29:14.203 09:33:22 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:29:14.203 09:33:22 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 243990 ']' 00:29:14.203 09:33:22 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 243990 00:29:14.203 09:33:22 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:29:14.203 00:29:14.203 real 18m36.783s 00:29:14.203 user 31m31.908s 00:29:14.203 sys 3m22.570s 00:29:14.203 09:33:23 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:14.203 09:33:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:14.203 ************************************ 00:29:14.203 END TEST bdev_raid 00:29:14.203 ************************************ 00:29:14.203 09:33:23 -- common/autotest_common.sh@1142 -- # return 0 00:29:14.203 09:33:23 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:14.203 09:33:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:14.203 09:33:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:14.203 09:33:23 -- common/autotest_common.sh@10 -- # set +x 00:29:14.203 ************************************ 00:29:14.203 START TEST bdevperf_config 00:29:14.203 ************************************ 00:29:14.203 09:33:23 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:14.475 * Looking for test storage... 
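The bdevperf_config suite that starts here assembles an INI-style job file with create_job, then launches the bdevperf example binary against the bdevs described in conf.json and counts the jobs it reports parsing. A condensed sketch of the first case exercised below (a 2-second read run expected to report 4 jobs), using only the paths, flags, and grep expressions that appear in the trace; the variable names are illustrative:

bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json
jobfile=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf
# Run bdevperf for 2 seconds with the JSON bdev config plus the generated job file
# (depending on where bdevperf emits its notices, stderr may also need capturing).
out=$("$bdevperf" -t 2 --json "$conf" -j "$jobfile")
# The binary logs how many job sections it parsed; this case asserts on 4.
njobs=$(grep -oE 'Using job config with [0-9]+ jobs' <<< "$out" | grep -oE '[0-9]+')
[[ $njobs == 4 ]]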
00:29:14.475 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:14.475 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:14.475 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:14.475 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:29:14.475 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:14.475 09:33:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:14.476 00:29:14.476 09:33:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:14.476 09:33:23 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:17.770 09:33:25 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-15 09:33:23.283901] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:17.770 [2024-07-15 09:33:23.283984] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid248654 ] 00:29:17.770 Using job config with 4 jobs 00:29:17.770 [2024-07-15 09:33:23.419779] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:17.771 [2024-07-15 09:33:23.542645] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:17.771 cpumask for '\''job0'\'' is too big 00:29:17.771 cpumask for '\''job1'\'' is too big 00:29:17.771 cpumask for '\''job2'\'' is too big 00:29:17.771 cpumask for '\''job3'\'' is too big 00:29:17.771 Running I/O for 2 seconds... 00:29:17.771 00:29:17.771 Latency(us) 00:29:17.771 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.02 24059.95 23.50 0.00 0.00 10625.67 1866.35 16412.49 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.02 24038.00 23.47 0.00 0.00 10611.46 1866.35 14474.91 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.03 24016.12 23.45 0.00 0.00 10596.90 1852.10 12594.31 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.03 23994.40 23.43 0.00 0.00 10582.81 1852.10 10884.67 00:29:17.771 =================================================================================================================== 00:29:17.771 Total : 96108.47 93.86 0.00 0.00 10604.21 1852.10 16412.49' 00:29:17.771 09:33:25 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-15 09:33:23.283901] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:29:17.771 [2024-07-15 09:33:23.283984] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid248654 ] 00:29:17.771 Using job config with 4 jobs 00:29:17.771 [2024-07-15 09:33:23.419779] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:17.771 [2024-07-15 09:33:23.542645] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:17.771 cpumask for '\''job0'\'' is too big 00:29:17.771 cpumask for '\''job1'\'' is too big 00:29:17.771 cpumask for '\''job2'\'' is too big 00:29:17.771 cpumask for '\''job3'\'' is too big 00:29:17.771 Running I/O for 2 seconds... 00:29:17.771 00:29:17.771 Latency(us) 00:29:17.771 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.02 24059.95 23.50 0.00 0.00 10625.67 1866.35 16412.49 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.02 24038.00 23.47 0.00 0.00 10611.46 1866.35 14474.91 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.03 24016.12 23.45 0.00 0.00 10596.90 1852.10 12594.31 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.03 23994.40 23.43 0.00 0.00 10582.81 1852.10 10884.67 00:29:17.771 =================================================================================================================== 00:29:17.771 Total : 96108.47 93.86 0.00 0.00 10604.21 1852.10 16412.49' 00:29:17.771 09:33:25 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 09:33:23.283901] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:17.771 [2024-07-15 09:33:23.283984] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid248654 ] 00:29:17.771 Using job config with 4 jobs 00:29:17.771 [2024-07-15 09:33:23.419779] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:17.771 [2024-07-15 09:33:23.542645] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:17.771 cpumask for '\''job0'\'' is too big 00:29:17.771 cpumask for '\''job1'\'' is too big 00:29:17.771 cpumask for '\''job2'\'' is too big 00:29:17.771 cpumask for '\''job3'\'' is too big 00:29:17.771 Running I/O for 2 seconds... 
00:29:17.771 00:29:17.771 Latency(us) 00:29:17.771 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.02 24059.95 23.50 0.00 0.00 10625.67 1866.35 16412.49 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.02 24038.00 23.47 0.00 0.00 10611.46 1866.35 14474.91 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.03 24016.12 23.45 0.00 0.00 10596.90 1852.10 12594.31 00:29:17.771 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:17.771 Malloc0 : 2.03 23994.40 23.43 0.00 0.00 10582.81 1852.10 10884.67 00:29:17.771 =================================================================================================================== 00:29:17.771 Total : 96108.47 93.86 0.00 0.00 10604.21 1852.10 16412.49' 00:29:17.771 09:33:25 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:17.771 09:33:25 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:17.771 09:33:26 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:29:17.771 09:33:26 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:17.771 [2024-07-15 09:33:26.070541] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:17.771 [2024-07-15 09:33:26.070610] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid249008 ] 00:29:17.771 [2024-07-15 09:33:26.211715] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:17.771 [2024-07-15 09:33:26.324206] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:17.771 cpumask for 'job0' is too big 00:29:17.771 cpumask for 'job1' is too big 00:29:17.771 cpumask for 'job2' is too big 00:29:17.771 cpumask for 'job3' is too big 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:29:20.308 Running I/O for 2 seconds... 
00:29:20.308 00:29:20.308 Latency(us) 00:29:20.308 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:20.308 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:20.308 Malloc0 : 2.01 24171.97 23.61 0.00 0.00 10572.16 1880.60 16298.52 00:29:20.308 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:20.308 Malloc0 : 2.02 24181.16 23.61 0.00 0.00 10544.13 1852.10 14360.93 00:29:20.308 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:20.308 Malloc0 : 2.02 24159.03 23.59 0.00 0.00 10530.01 1837.86 12537.32 00:29:20.308 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:20.308 Malloc0 : 2.03 24137.10 23.57 0.00 0.00 10516.22 1837.86 10884.67 00:29:20.308 =================================================================================================================== 00:29:20.308 Total : 96649.27 94.38 0.00 0.00 10540.59 1837.86 16298.52' 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:20.308 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:20.308 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:20.308 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:20.308 09:33:28 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:22.846 09:33:31 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-15 09:33:28.802973] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:22.846 [2024-07-15 09:33:28.803041] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid249367 ] 00:29:22.846 Using job config with 3 jobs 00:29:22.846 [2024-07-15 09:33:28.944667] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:22.846 [2024-07-15 09:33:29.050481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:22.846 cpumask for '\''job0'\'' is too big 00:29:22.846 cpumask for '\''job1'\'' is too big 00:29:22.846 cpumask for '\''job2'\'' is too big 00:29:22.846 Running I/O for 2 seconds... 00:29:22.846 00:29:22.846 Latency(us) 00:29:22.846 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:22.846 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:22.846 Malloc0 : 2.01 32487.23 31.73 0.00 0.00 7874.47 1802.24 11511.54 00:29:22.846 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:22.846 Malloc0 : 2.02 32499.10 31.74 0.00 0.00 7854.09 1780.87 9744.92 00:29:22.846 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:22.846 Malloc0 : 2.02 32469.41 31.71 0.00 0.00 7843.89 1780.87 9061.06 00:29:22.846 =================================================================================================================== 00:29:22.846 Total : 97455.74 95.17 0.00 0.00 7857.46 1780.87 11511.54' 00:29:22.846 09:33:31 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-15 09:33:28.802973] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:22.846 [2024-07-15 09:33:28.803041] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid249367 ] 00:29:22.846 Using job config with 3 jobs 00:29:22.846 [2024-07-15 09:33:28.944667] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:22.846 [2024-07-15 09:33:29.050481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:22.846 cpumask for '\''job0'\'' is too big 00:29:22.846 cpumask for '\''job1'\'' is too big 00:29:22.846 cpumask for '\''job2'\'' is too big 00:29:22.846 Running I/O for 2 seconds... 
00:29:22.846 00:29:22.846 Latency(us) 00:29:22.846 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:22.846 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:22.846 Malloc0 : 2.01 32487.23 31.73 0.00 0.00 7874.47 1802.24 11511.54 00:29:22.846 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:22.846 Malloc0 : 2.02 32499.10 31.74 0.00 0.00 7854.09 1780.87 9744.92 00:29:22.846 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:22.846 Malloc0 : 2.02 32469.41 31.71 0.00 0.00 7843.89 1780.87 9061.06 00:29:22.846 =================================================================================================================== 00:29:22.846 Total : 97455.74 95.17 0.00 0.00 7857.46 1780.87 11511.54' 00:29:22.846 09:33:31 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 09:33:28.802973] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:22.847 [2024-07-15 09:33:28.803041] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid249367 ] 00:29:22.847 Using job config with 3 jobs 00:29:22.847 [2024-07-15 09:33:28.944667] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:22.847 [2024-07-15 09:33:29.050481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:22.847 cpumask for '\''job0'\'' is too big 00:29:22.847 cpumask for '\''job1'\'' is too big 00:29:22.847 cpumask for '\''job2'\'' is too big 00:29:22.847 Running I/O for 2 seconds... 00:29:22.847 00:29:22.847 Latency(us) 00:29:22.847 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:22.847 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:22.847 Malloc0 : 2.01 32487.23 31.73 0.00 0.00 7874.47 1802.24 11511.54 00:29:22.847 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:22.847 Malloc0 : 2.02 32499.10 31.74 0.00 0.00 7854.09 1780.87 9744.92 00:29:22.847 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:22.847 Malloc0 : 2.02 32469.41 31.71 0.00 0.00 7843.89 1780.87 9061.06 00:29:22.847 =================================================================================================================== 00:29:22.847 Total : 97455.74 95.17 0.00 0.00 7857.46 1780.87 11511.54' 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 
00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:22.847 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:22.847 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:22.847 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:22.847 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:22.847 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:22.847 09:33:31 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:25.386 09:33:34 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-15 09:33:31.544369] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:29:25.386 [2024-07-15 09:33:31.544435] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid249720 ] 00:29:25.386 Using job config with 4 jobs 00:29:25.386 [2024-07-15 09:33:31.695900] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:25.386 [2024-07-15 09:33:31.819280] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:25.386 cpumask for '\''job0'\'' is too big 00:29:25.386 cpumask for '\''job1'\'' is too big 00:29:25.386 cpumask for '\''job2'\'' is too big 00:29:25.386 cpumask for '\''job3'\'' is too big 00:29:25.386 Running I/O for 2 seconds... 00:29:25.386 00:29:25.386 Latency(us) 00:29:25.386 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.03 11967.83 11.69 0.00 0.00 21366.10 3789.69 33052.94 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.03 11956.68 11.68 0.00 0.00 21367.51 4616.01 33052.94 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.04 11945.86 11.67 0.00 0.00 21309.04 3761.20 29177.77 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.04 11934.73 11.66 0.00 0.00 21308.93 4616.01 29177.77 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.04 11924.00 11.64 0.00 0.00 21251.32 3789.69 25416.57 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.04 11913.01 11.63 0.00 0.00 21247.75 4616.01 25416.57 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.05 11995.75 11.71 0.00 0.00 21025.73 3618.73 21769.35 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.05 11984.69 11.70 0.00 0.00 21027.57 2835.14 21769.35 00:29:25.386 =================================================================================================================== 00:29:25.386 Total : 95622.55 93.38 0.00 0.00 21237.44 2835.14 33052.94' 00:29:25.386 09:33:34 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-15 09:33:31.544369] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:25.386 [2024-07-15 09:33:31.544435] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid249720 ] 00:29:25.386 Using job config with 4 jobs 00:29:25.386 [2024-07-15 09:33:31.695900] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:25.386 [2024-07-15 09:33:31.819280] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:25.386 cpumask for '\''job0'\'' is too big 00:29:25.386 cpumask for '\''job1'\'' is too big 00:29:25.386 cpumask for '\''job2'\'' is too big 00:29:25.386 cpumask for '\''job3'\'' is too big 00:29:25.386 Running I/O for 2 seconds... 
00:29:25.386 00:29:25.386 Latency(us) 00:29:25.386 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.03 11967.83 11.69 0.00 0.00 21366.10 3789.69 33052.94 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.03 11956.68 11.68 0.00 0.00 21367.51 4616.01 33052.94 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.04 11945.86 11.67 0.00 0.00 21309.04 3761.20 29177.77 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.04 11934.73 11.66 0.00 0.00 21308.93 4616.01 29177.77 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.04 11924.00 11.64 0.00 0.00 21251.32 3789.69 25416.57 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.04 11913.01 11.63 0.00 0.00 21247.75 4616.01 25416.57 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.05 11995.75 11.71 0.00 0.00 21025.73 3618.73 21769.35 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.05 11984.69 11.70 0.00 0.00 21027.57 2835.14 21769.35 00:29:25.386 =================================================================================================================== 00:29:25.386 Total : 95622.55 93.38 0.00 0.00 21237.44 2835.14 33052.94' 00:29:25.386 09:33:34 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 09:33:31.544369] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:25.386 [2024-07-15 09:33:31.544435] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid249720 ] 00:29:25.386 Using job config with 4 jobs 00:29:25.386 [2024-07-15 09:33:31.695900] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:25.386 [2024-07-15 09:33:31.819280] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:25.386 cpumask for '\''job0'\'' is too big 00:29:25.386 cpumask for '\''job1'\'' is too big 00:29:25.386 cpumask for '\''job2'\'' is too big 00:29:25.386 cpumask for '\''job3'\'' is too big 00:29:25.386 Running I/O for 2 seconds... 
00:29:25.386 00:29:25.386 Latency(us) 00:29:25.386 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.03 11967.83 11.69 0.00 0.00 21366.10 3789.69 33052.94 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.03 11956.68 11.68 0.00 0.00 21367.51 4616.01 33052.94 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.04 11945.86 11.67 0.00 0.00 21309.04 3761.20 29177.77 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.04 11934.73 11.66 0.00 0.00 21308.93 4616.01 29177.77 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.04 11924.00 11.64 0.00 0.00 21251.32 3789.69 25416.57 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.04 11913.01 11.63 0.00 0.00 21247.75 4616.01 25416.57 00:29:25.386 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc0 : 2.05 11995.75 11.71 0.00 0.00 21025.73 3618.73 21769.35 00:29:25.386 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:25.386 Malloc1 : 2.05 11984.69 11.70 0.00 0.00 21027.57 2835.14 21769.35 00:29:25.386 =================================================================================================================== 00:29:25.386 Total : 95622.55 93.38 0.00 0.00 21237.44 2835.14 33052.94' 00:29:25.386 09:33:34 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:25.386 09:33:34 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:25.386 09:33:34 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:29:25.386 09:33:34 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:29:25.386 09:33:34 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:25.386 09:33:34 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:29:25.386 00:29:25.386 real 0m11.225s 00:29:25.386 user 0m9.881s 00:29:25.386 sys 0m1.184s 00:29:25.386 09:33:34 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:25.386 09:33:34 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:29:25.386 ************************************ 00:29:25.386 END TEST bdevperf_config 00:29:25.386 ************************************ 00:29:25.648 09:33:34 -- common/autotest_common.sh@1142 -- # return 0 00:29:25.648 09:33:34 -- spdk/autotest.sh@192 -- # uname -s 00:29:25.648 09:33:34 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:29:25.648 09:33:34 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:25.648 09:33:34 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:25.648 09:33:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:25.649 09:33:34 -- common/autotest_common.sh@10 -- # set +x 00:29:25.649 ************************************ 00:29:25.649 START TEST reactor_set_interrupt 00:29:25.649 ************************************ 00:29:25.649 09:33:34 
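The job-count assertion above ([[ 4 == \4 ]]) comes from get_num_jobs, whose behavior is visible in the trace: it echoes the captured bdevperf output and filters it with the two grep calls shown. A minimal standalone sketch of that parsing step, using an illustrative sample string rather than real captured output:

get_num_jobs() {
    # $1: captured bdevperf stdout; extract N from "Using job config with N jobs"
    echo "$1" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+'
}
sample_output='... Using job config with 4 jobs ...'   # illustrative stand-in, not a real run
num_jobs=$(get_num_jobs "$sample_output")
echo "parsed job count: $num_jobs"                      # prints 4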
reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:25.649 * Looking for test storage... 00:29:25.649 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:25.649 09:33:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:29:25.649 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:25.649 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:25.649 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:25.649 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:29:25.649 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:25.649 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:29:25.649 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:29:25.649 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:29:25.649 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:29:25.649 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:29:25.649 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:29:25.649 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:29:25.649 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:29:25.649 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 
00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:29:25.649 09:33:34 
reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:29:25.649 09:33:34 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:29:25.649 09:33:34 
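The build_config.sh listing above is a flat set of CONFIG_* shell variables describing how SPDK was built. A short illustrative sketch of how such a sourced flag file can be consumed, with flag names taken from the listing; the branch bodies are placeholders, not the actual SPDK test logic:

# Illustrative only: consume CONFIG_* flags from the sourced build_config.sh above.
source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh
if [[ $CONFIG_UBSAN == y ]]; then
    echo "UBSAN-instrumented build"              # placeholder action
fi
if [[ $CONFIG_CRYPTO == y && $CONFIG_IPSEC_MB == y ]]; then
    echo "crypto enabled with intel-ipsec-mb"    # placeholder action
fi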
reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:29:25.649 09:33:34 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:29:25.650 09:33:34 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:29:25.650 09:33:34 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:29:25.650 09:33:34 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:29:25.650 #define SPDK_CONFIG_H 00:29:25.650 #define SPDK_CONFIG_APPS 1 00:29:25.650 #define SPDK_CONFIG_ARCH native 00:29:25.650 #undef SPDK_CONFIG_ASAN 00:29:25.650 #undef SPDK_CONFIG_AVAHI 00:29:25.650 #undef SPDK_CONFIG_CET 00:29:25.650 #define SPDK_CONFIG_COVERAGE 1 00:29:25.650 #define SPDK_CONFIG_CROSS_PREFIX 00:29:25.650 #define SPDK_CONFIG_CRYPTO 1 00:29:25.650 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:29:25.650 #undef SPDK_CONFIG_CUSTOMOCF 00:29:25.650 #undef SPDK_CONFIG_DAOS 00:29:25.650 #define SPDK_CONFIG_DAOS_DIR 00:29:25.650 #define SPDK_CONFIG_DEBUG 1 00:29:25.650 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:29:25.650 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:25.650 #define SPDK_CONFIG_DPDK_INC_DIR 00:29:25.650 #define SPDK_CONFIG_DPDK_LIB_DIR 00:29:25.650 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:29:25.650 #undef SPDK_CONFIG_DPDK_UADK 00:29:25.650 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:25.650 #define SPDK_CONFIG_EXAMPLES 1 00:29:25.650 #undef SPDK_CONFIG_FC 00:29:25.650 #define SPDK_CONFIG_FC_PATH 00:29:25.650 #define SPDK_CONFIG_FIO_PLUGIN 1 00:29:25.650 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:29:25.650 #undef SPDK_CONFIG_FUSE 00:29:25.650 #undef SPDK_CONFIG_FUZZER 00:29:25.650 #define SPDK_CONFIG_FUZZER_LIB 00:29:25.650 #undef SPDK_CONFIG_GOLANG 00:29:25.650 #define 
SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:29:25.650 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:29:25.650 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:29:25.650 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:29:25.650 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:29:25.650 #undef SPDK_CONFIG_HAVE_LIBBSD 00:29:25.650 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:29:25.650 #define SPDK_CONFIG_IDXD 1 00:29:25.650 #define SPDK_CONFIG_IDXD_KERNEL 1 00:29:25.650 #define SPDK_CONFIG_IPSEC_MB 1 00:29:25.650 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:25.650 #define SPDK_CONFIG_ISAL 1 00:29:25.650 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:29:25.650 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:29:25.650 #define SPDK_CONFIG_LIBDIR 00:29:25.650 #undef SPDK_CONFIG_LTO 00:29:25.650 #define SPDK_CONFIG_MAX_LCORES 128 00:29:25.650 #define SPDK_CONFIG_NVME_CUSE 1 00:29:25.650 #undef SPDK_CONFIG_OCF 00:29:25.650 #define SPDK_CONFIG_OCF_PATH 00:29:25.650 #define SPDK_CONFIG_OPENSSL_PATH 00:29:25.650 #undef SPDK_CONFIG_PGO_CAPTURE 00:29:25.650 #define SPDK_CONFIG_PGO_DIR 00:29:25.650 #undef SPDK_CONFIG_PGO_USE 00:29:25.650 #define SPDK_CONFIG_PREFIX /usr/local 00:29:25.650 #undef SPDK_CONFIG_RAID5F 00:29:25.650 #undef SPDK_CONFIG_RBD 00:29:25.650 #define SPDK_CONFIG_RDMA 1 00:29:25.650 #define SPDK_CONFIG_RDMA_PROV verbs 00:29:25.650 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:29:25.650 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:29:25.650 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:29:25.650 #define SPDK_CONFIG_SHARED 1 00:29:25.650 #undef SPDK_CONFIG_SMA 00:29:25.650 #define SPDK_CONFIG_TESTS 1 00:29:25.650 #undef SPDK_CONFIG_TSAN 00:29:25.650 #define SPDK_CONFIG_UBLK 1 00:29:25.650 #define SPDK_CONFIG_UBSAN 1 00:29:25.650 #undef SPDK_CONFIG_UNIT_TESTS 00:29:25.650 #undef SPDK_CONFIG_URING 00:29:25.650 #define SPDK_CONFIG_URING_PATH 00:29:25.650 #undef SPDK_CONFIG_URING_ZNS 00:29:25.650 #undef SPDK_CONFIG_USDT 00:29:25.650 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:29:25.650 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:29:25.650 #undef SPDK_CONFIG_VFIO_USER 00:29:25.650 #define SPDK_CONFIG_VFIO_USER_DIR 00:29:25.650 #define SPDK_CONFIG_VHOST 1 00:29:25.650 #define SPDK_CONFIG_VIRTIO 1 00:29:25.650 #undef SPDK_CONFIG_VTUNE 00:29:25.650 #define SPDK_CONFIG_VTUNE_DIR 00:29:25.650 #define SPDK_CONFIG_WERROR 1 00:29:25.650 #define SPDK_CONFIG_WPDK_DIR 00:29:25.650 #undef SPDK_CONFIG_XNVME 00:29:25.650 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:29:25.650 09:33:34 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:25.650 09:33:34 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:25.650 09:33:34 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:25.650 09:33:34 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:25.650 09:33:34 reactor_set_interrupt -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:25.650 09:33:34 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:25.650 09:33:34 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:25.650 09:33:34 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:29:25.650 09:33:34 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:29:25.650 
09:33:34 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:29:25.650 09:33:34 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- 
common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:29:25.650 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:29:25.651 
09:33:34 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:29:25.651 09:33:34 
reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@240 -- # 
DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:29:25.651 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:29:25.652 
09:33:34 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 250116 ]] 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 250116 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.TPrcoE 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.TPrcoE/tests/interrupt /tmp/spdk.TPrcoE 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4338139136 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88776376320 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5732139008 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892283904 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9420800 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253442560 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=815104 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 
00:29:25.652 * Looking for test storage... 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:25.652 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88776376320 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7946731520 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:25.912 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:29:25.912 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:25.912 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=250221 00:29:25.913 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:25.913 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 250221 /var/tmp/spdk.sock 00:29:25.913 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 250221 ']' 00:29:25.913 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:25.913 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:25.913 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:25.913 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
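The "Waiting for process to start up and listen on UNIX domain socket ..." message above comes from the harness blocking until the freshly launched app answers RPCs. A rough sketch of such a wait loop is shown below; it assumes a successful `rpc.py ... rpc_get_methods` call is a reasonable readiness probe, and the helper name, `$rootdir`, retry count, and sleep interval are all illustrative rather than the framework's exact implementation.

  # Hypothetical helper: block until an SPDK app answers RPCs on $rpc_addr,
  # or give up after max_retries attempts.
  wait_for_rpc_socket() {
      local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock}
      local max_retries=100
      echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
      for ((i = 0; i < max_retries; i++)); do
          # Bail out early if the target process already died.
          kill -0 "$pid" 2>/dev/null || return 1
          # rpc_get_methods succeeds only once the RPC server accepts connections.
          if "$rootdir/scripts/rpc.py" -s "$rpc_addr" rpc_get_methods >/dev/null 2>&1; then
              return 0
          fi
          sleep 0.5
      done
      return 1
  }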
00:29:25.913 09:33:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:25.913 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:25.913 09:33:34 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:25.913 [2024-07-15 09:33:34.653486] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:25.913 [2024-07-15 09:33:34.653552] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid250221 ] 00:29:25.913 [2024-07-15 09:33:34.785341] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:26.171 [2024-07-15 09:33:34.891237] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:26.171 [2024-07-15 09:33:34.891324] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:26.171 [2024-07-15 09:33:34.891330] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:26.171 [2024-07-15 09:33:34.962407] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:29:27.110 09:33:35 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:27.110 09:33:35 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:29:27.110 09:33:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:29:27.110 09:33:35 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:27.369 Malloc0 00:29:27.369 Malloc1 00:29:27.369 Malloc2 00:29:27.369 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:29:27.369 09:33:36 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:29:27.369 09:33:36 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:27.369 09:33:36 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:27.369 5000+0 records in 00:29:27.369 5000+0 records out 00:29:27.369 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0374337 s, 274 MB/s 00:29:27.369 09:33:36 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:27.628 AIO0 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 250221 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 250221 without_thd 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=250221 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:29:27.628 09:33:36 reactor_set_interrupt -- 
interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:27.628 09:33:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:27.887 09:33:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:29:27.887 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:29:27.887 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:29:27.887 09:33:36 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:29:27.887 09:33:36 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:27.887 09:33:36 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:29:27.887 09:33:36 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:27.887 09:33:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:27.887 09:33:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:28.145 09:33:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:29:28.145 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:29:28.145 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:29:28.145 spdk_thread ids are 1 on reactor0. 
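The "spdk_thread ids are 1 on reactor0." line is printed after the helper traced above maps a reactor's CPU mask to the SPDK threads pinned to it. A condensed sketch of that lookup, using the same thread_get_stats RPC and jq filter seen in the trace (the rpc.py path is shortened here for readability):

  # Return the ids of spdk_threads whose cpumask matches a given reactor mask,
  # e.g. reactor_get_thread_ids 0x1 for reactor 0.
  reactor_get_thread_ids() {
      local reactor_cpumask=$1
      # Normalise "0x1" -> "1" so it compares equal to the cpumask the app reports.
      reactor_cpumask=$(printf '%x' "$((reactor_cpumask))")
      ./scripts/rpc.py thread_get_stats | \
          jq --arg reactor_cpumask "$reactor_cpumask" \
             '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
  }

As in the trace, the result can be empty (reactor 2 has no thread yet at this point), which is why the caller captures it into an array and only asserts on reactor 0.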
00:29:28.145 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:28.145 09:33:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 250221 0 00:29:28.145 09:33:36 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 250221 0 idle 00:29:28.145 09:33:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=250221 00:29:28.145 09:33:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:28.146 09:33:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:28.146 09:33:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:28.146 09:33:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:28.146 09:33:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:28.146 09:33:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:28.146 09:33:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:28.146 09:33:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:28.146 09:33:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 250221 -w 256 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 250221 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.40 reactor_0' 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 250221 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.40 reactor_0 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 250221 1 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 250221 1 idle 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=250221 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 250221 -w 256 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:29:28.405 
09:33:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 250272 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 250272 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 250221 2 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 250221 2 idle 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=250221 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 250221 -w 256 00:29:28.405 09:33:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:28.664 09:33:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 250273 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:29:28.664 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 250273 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:29:28.664 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:28.664 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:28.664 09:33:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:28.664 09:33:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:28.665 09:33:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:28.665 09:33:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:28.665 09:33:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:28.665 09:33:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:28.665 09:33:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:29:28.665 09:33:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:29:28.665 09:33:37 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:29:28.924 [2024-07-15 09:33:37.740296] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:29:28.924 09:33:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:29:29.184 [2024-07-15 09:33:37.988016] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:29:29.184 [2024-07-15 09:33:37.988413] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:29.184 09:33:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:29:29.443 [2024-07-15 09:33:38.231821] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:29:29.443 [2024-07-15 09:33:38.232008] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 250221 0 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 250221 0 busy 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=250221 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 250221 -w 256 00:29:29.443 09:33:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 250221 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.83 reactor_0' 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 250221 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.83 reactor_0 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:29.703 09:33:38 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 250221 2 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 250221 2 busy 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=250221 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 250221 -w 256 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 250273 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 250273 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:29.703 09:33:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:29:30.272 [2024-07-15 09:33:39.095817] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:29:30.272 [2024-07-15 09:33:39.095932] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 250221 2 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 250221 2 idle 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=250221 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 250221 -w 256 00:29:30.272 09:33:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 250273 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.86 reactor_2' 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 250273 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.86 reactor_2 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:30.531 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:29:30.790 [2024-07-15 09:33:39.531813] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:29:30.790 [2024-07-15 09:33:39.531921] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:30.790 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:29:30.790 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:29:30.790 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:29:31.050 [2024-07-15 09:33:39.768221] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
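Each reactor_is_idle / reactor_is_busy step in the trace boils down to taking one `top` snapshot of the reactor thread and thresholding its CPU column. A simplified sketch of that check is below; the 30% idle and 70% busy cut-offs follow the comparisons visible in the trace, while the retry loop the real helper wraps around this is omitted.

  # Check whether reactor_<idx> of an SPDK app looks idle or busy in a single
  # `top` snapshot. Returns 0 when the observed state matches the expected one.
  reactor_is_busy_or_idle() {
      local pid=$1 idx=$2 state=$3   # state is "busy" or "idle"
      local line cpu_rate

      # One batch iteration, thread view, only this pid, wide output.
      line=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}")
      cpu_rate=$(echo "$line" | sed -e 's/^\s*//g' | awk '{print $9}')
      cpu_rate=${cpu_rate%%.*}       # drop the fractional part, as the trace does
      cpu_rate=${cpu_rate:-0}

      if [[ $state == busy ]]; then
          (( cpu_rate >= 70 ))       # a polling reactor sits near 100% CPU
      else
          (( cpu_rate <= 30 ))       # an interrupt-mode reactor sits near 0% CPU
      fi
  }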
00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 250221 0 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 250221 0 idle 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=250221 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 250221 -w 256 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 250221 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.95 reactor_0' 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 250221 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.95 reactor_0 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:29:31.050 09:33:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 250221 00:29:31.050 09:33:39 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 250221 ']' 00:29:31.050 09:33:39 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 250221 00:29:31.050 09:33:39 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:29:31.050 09:33:39 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:31.050 09:33:39 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 250221 00:29:31.309 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:31.309 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:31.309 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 250221' 00:29:31.309 killing process with pid 250221 00:29:31.309 09:33:40 reactor_set_interrupt -- 
common/autotest_common.sh@967 -- # kill 250221 00:29:31.309 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 250221 00:29:31.309 09:33:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:29:31.309 09:33:40 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:31.569 09:33:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:29:31.569 09:33:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:31.569 09:33:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:31.569 09:33:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=251066 00:29:31.569 09:33:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:31.569 09:33:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:31.569 09:33:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 251066 /var/tmp/spdk.sock 00:29:31.569 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 251066 ']' 00:29:31.569 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:31.569 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:31.569 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:31.569 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:31.569 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:31.569 09:33:40 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:31.569 [2024-07-15 09:33:40.310315] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:31.569 [2024-07-15 09:33:40.310387] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid251066 ] 00:29:31.569 [2024-07-15 09:33:40.438070] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:31.828 [2024-07-15 09:33:40.537106] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:31.828 [2024-07-15 09:33:40.537190] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:31.828 [2024-07-15 09:33:40.537194] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:31.828 [2024-07-15 09:33:40.610889] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
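Both halves of the test start from the same prologue: launch the interrupt_tgt example bound to three cores, register a cleanup trap, and block until the RPC socket answers. A compact sketch of that start-up sequence follows; the binary path and the -E/-g flags are taken from the trace, while `$rootdir`, `$testdir`, the kill/rm cleanup, and the reuse of the wait helper sketched earlier are plain-bash approximations of what the real framework does.

  # Start the interrupt_tgt example app on cores 0-2 and wait for its RPC socket.
  start_intr_tgt() {
      local rpc_addr=${1:-/var/tmp/spdk.sock}
      local cpu_mask=${2:-0x07}

      "$rootdir/build/examples/interrupt_tgt" -m "$cpu_mask" -r "$rpc_addr" -E -g &
      intr_tgt_pid=$!

      # Tear the app down on any early exit; the test clears this trap
      # again once it finishes cleanly.
      trap 'kill -9 "$intr_tgt_pid" 2>/dev/null; rm -f "$testdir/aiofile"; exit 1' SIGINT SIGTERM EXIT

      # Readiness probe sketched earlier in this log.
      wait_for_rpc_socket "$intr_tgt_pid" "$rpc_addr"
  }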
00:29:32.396 09:33:41 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:32.396 09:33:41 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:29:32.396 09:33:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:29:32.396 09:33:41 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:32.656 Malloc0 00:29:32.656 Malloc1 00:29:32.656 Malloc2 00:29:32.656 09:33:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:29:32.656 09:33:41 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:29:32.656 09:33:41 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:32.656 09:33:41 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:32.656 5000+0 records in 00:29:32.656 5000+0 records out 00:29:32.656 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0247896 s, 413 MB/s 00:29:32.656 09:33:41 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:32.945 AIO0 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 251066 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 251066 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=251066 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:32.945 09:33:41 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:33.209 09:33:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:29:33.209 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:29:33.209 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:29:33.209 09:33:42 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:29:33.209 09:33:42 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:33.209 09:33:42 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:29:33.209 09:33:42 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:29:33.209 09:33:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:33.209 09:33:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:29:33.469 spdk_thread ids are 1 on reactor0. 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 251066 0 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 251066 0 idle 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=251066 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 251066 -w 256 00:29:33.469 09:33:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:33.728 09:33:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 251066 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.38 reactor_0' 00:29:33.728 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 251066 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.38 reactor_0 00:29:33.728 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:33.728 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 251066 1 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 251066 1 idle 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=251066 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:29:33.729 09:33:42 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 251066 -w 256 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 251091 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1' 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 251091 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 251066 2 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 251066 2 idle 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=251066 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:33.729 09:33:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 251066 -w 256 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 251092 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2' 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 251092 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:29:33.988 09:33:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:29:34.247 [2024-07-15 09:33:43.061860] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:29:34.247 [2024-07-15 09:33:43.062103] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:29:34.247 [2024-07-15 09:33:43.062231] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:34.247 09:33:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:29:34.505 [2024-07-15 09:33:43.310323] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:29:34.505 [2024-07-15 09:33:43.310477] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 251066 0 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 251066 0 busy 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=251066 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 251066 -w 256 00:29:34.505 09:33:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 251066 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.82 reactor_0' 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 251066 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.82 reactor_0 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:34.764 09:33:43 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 251066 2 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 251066 2 busy 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=251066 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 251066 -w 256 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 251092 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2' 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 251092 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:34.764 09:33:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:29:35.023 [2024-07-15 09:33:43.920033] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:29:35.023 [2024-07-15 09:33:43.920149] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 251066 2 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 251066 2 idle 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=251066 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 251066 -w 256 00:29:35.023 09:33:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 251092 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2' 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 251092 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:35.280 09:33:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:29:35.539 [2024-07-15 09:33:44.284978] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:29:35.539 [2024-07-15 09:33:44.285116] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
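The mode switches logged above are driven through rpc.py's plugin mechanism: with `-d` the reactor_set_interrupt_mode RPC drops a reactor into poll mode, and without it the reactor returns to interrupt mode. A minimal sketch of the toggle-and-verify pattern used around reactor 2 is shown here; it assumes PYTHONPATH already includes the interrupt_tgt plugin directory (exported earlier in the trace), reuses the check function sketched above, and uses an illustrative one-second settle delay in place of the framework's retry loop.

  rpc_py="./scripts/rpc.py --plugin interrupt_plugin"

  # Drop reactor 2 out of interrupt mode; it should start polling at ~100% CPU.
  $rpc_py reactor_set_interrupt_mode 2 -d
  sleep 1
  reactor_is_busy_or_idle "$intr_tgt_pid" 2 busy || echo "reactor 2 did not go busy"

  # Switch reactor 2 back to interrupt mode; its CPU usage should fall to ~0%.
  $rpc_py reactor_set_interrupt_mode 2
  sleep 1
  reactor_is_busy_or_idle "$intr_tgt_pid" 2 idle || echo "reactor 2 did not go idle"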
00:29:35.539 [2024-07-15 09:33:44.285142] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 251066 0 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 251066 0 idle 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=251066 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 251066 -w 256 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:35.539 09:33:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 251066 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.61 reactor_0' 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 251066 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.61 reactor_0 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:29:35.540 09:33:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 251066 00:29:35.540 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 251066 ']' 00:29:35.540 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 251066 00:29:35.540 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:29:35.799 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:35.799 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 251066 00:29:35.799 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:35.799 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:29:35.799 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 251066' 00:29:35.799 killing process with pid 251066 00:29:35.799 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 251066 00:29:35.799 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 251066 00:29:36.058 09:33:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:29:36.058 09:33:44 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:36.058 00:29:36.058 real 0m10.411s 00:29:36.058 user 0m10.338s 00:29:36.058 sys 0m2.114s 00:29:36.058 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:36.058 09:33:44 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:36.058 ************************************ 00:29:36.058 END TEST reactor_set_interrupt 00:29:36.058 ************************************ 00:29:36.058 09:33:44 -- common/autotest_common.sh@1142 -- # return 0 00:29:36.058 09:33:44 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:36.058 09:33:44 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:36.058 09:33:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:36.058 09:33:44 -- common/autotest_common.sh@10 -- # set +x 00:29:36.058 ************************************ 00:29:36.058 START TEST reap_unregistered_poller 00:29:36.058 ************************************ 00:29:36.058 09:33:44 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:36.058 * Looking for test storage... 00:29:36.058 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:36.058 09:33:44 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:29:36.058 09:33:44 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:36.058 09:33:44 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:36.058 09:33:44 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:36.058 09:33:44 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
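At the end of the run the target is shut down with killprocess: the helper confirms the pid is still alive and is not a sudo wrapper before signalling it and waiting for it to exit. A condensed sketch of that pattern as it appears in the trace (the real helper in autotest_common.sh carries additional platform handling and retries):

# Condensed form of the shutdown pattern traced above.
killprocess() {
    local pid=$1
    [[ -n $pid ]] || return 1
    kill -0 "$pid" 2>/dev/null || return 0   # nothing to do if it already exited

    # The trace checks the command name so that a sudo wrapper is never signalled.
    local process_name
    process_name=$(ps --no-headers -o comm= "$pid")
    [[ $process_name != sudo ]] || return 1

    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" 2>/dev/null || true
}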
00:29:36.058 09:33:44 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:36.058 09:33:44 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:29:36.058 09:33:44 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:29:36.058 09:33:44 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:29:36.058 09:33:44 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:29:36.058 09:33:44 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:29:36.059 09:33:44 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:29:36.059 09:33:44 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:29:36.059 09:33:44 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:29:36.059 09:33:44 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:29:36.059 09:33:44 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:29:36.059 09:33:45 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:29:36.059 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:36.059 09:33:45 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:29:36.321 #define SPDK_CONFIG_H 00:29:36.321 #define SPDK_CONFIG_APPS 1 00:29:36.321 #define SPDK_CONFIG_ARCH native 00:29:36.321 #undef SPDK_CONFIG_ASAN 00:29:36.321 #undef SPDK_CONFIG_AVAHI 00:29:36.321 #undef SPDK_CONFIG_CET 00:29:36.321 #define SPDK_CONFIG_COVERAGE 1 00:29:36.321 #define SPDK_CONFIG_CROSS_PREFIX 00:29:36.321 #define SPDK_CONFIG_CRYPTO 1 00:29:36.321 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:29:36.321 #undef SPDK_CONFIG_CUSTOMOCF 00:29:36.321 #undef SPDK_CONFIG_DAOS 00:29:36.321 #define SPDK_CONFIG_DAOS_DIR 00:29:36.321 #define SPDK_CONFIG_DEBUG 1 00:29:36.321 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:29:36.321 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:36.321 #define SPDK_CONFIG_DPDK_INC_DIR 00:29:36.321 #define SPDK_CONFIG_DPDK_LIB_DIR 00:29:36.321 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:29:36.321 #undef SPDK_CONFIG_DPDK_UADK 00:29:36.321 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:36.321 #define SPDK_CONFIG_EXAMPLES 1 00:29:36.321 #undef SPDK_CONFIG_FC 00:29:36.321 #define SPDK_CONFIG_FC_PATH 00:29:36.321 #define SPDK_CONFIG_FIO_PLUGIN 1 00:29:36.321 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:29:36.321 #undef SPDK_CONFIG_FUSE 00:29:36.321 #undef SPDK_CONFIG_FUZZER 00:29:36.321 #define SPDK_CONFIG_FUZZER_LIB 00:29:36.321 #undef SPDK_CONFIG_GOLANG 00:29:36.321 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:29:36.321 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:29:36.321 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:29:36.321 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:29:36.321 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:29:36.321 #undef SPDK_CONFIG_HAVE_LIBBSD 00:29:36.321 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:29:36.321 #define SPDK_CONFIG_IDXD 1 00:29:36.321 #define SPDK_CONFIG_IDXD_KERNEL 1 00:29:36.321 #define SPDK_CONFIG_IPSEC_MB 1 00:29:36.321 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:36.321 #define SPDK_CONFIG_ISAL 1 00:29:36.321 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:29:36.321 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:29:36.321 #define SPDK_CONFIG_LIBDIR 00:29:36.321 #undef SPDK_CONFIG_LTO 
00:29:36.321 #define SPDK_CONFIG_MAX_LCORES 128 00:29:36.321 #define SPDK_CONFIG_NVME_CUSE 1 00:29:36.321 #undef SPDK_CONFIG_OCF 00:29:36.321 #define SPDK_CONFIG_OCF_PATH 00:29:36.321 #define SPDK_CONFIG_OPENSSL_PATH 00:29:36.321 #undef SPDK_CONFIG_PGO_CAPTURE 00:29:36.321 #define SPDK_CONFIG_PGO_DIR 00:29:36.321 #undef SPDK_CONFIG_PGO_USE 00:29:36.321 #define SPDK_CONFIG_PREFIX /usr/local 00:29:36.321 #undef SPDK_CONFIG_RAID5F 00:29:36.321 #undef SPDK_CONFIG_RBD 00:29:36.321 #define SPDK_CONFIG_RDMA 1 00:29:36.321 #define SPDK_CONFIG_RDMA_PROV verbs 00:29:36.321 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:29:36.321 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:29:36.321 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:29:36.321 #define SPDK_CONFIG_SHARED 1 00:29:36.321 #undef SPDK_CONFIG_SMA 00:29:36.321 #define SPDK_CONFIG_TESTS 1 00:29:36.321 #undef SPDK_CONFIG_TSAN 00:29:36.321 #define SPDK_CONFIG_UBLK 1 00:29:36.321 #define SPDK_CONFIG_UBSAN 1 00:29:36.321 #undef SPDK_CONFIG_UNIT_TESTS 00:29:36.321 #undef SPDK_CONFIG_URING 00:29:36.321 #define SPDK_CONFIG_URING_PATH 00:29:36.321 #undef SPDK_CONFIG_URING_ZNS 00:29:36.321 #undef SPDK_CONFIG_USDT 00:29:36.321 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:29:36.321 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:29:36.321 #undef SPDK_CONFIG_VFIO_USER 00:29:36.321 #define SPDK_CONFIG_VFIO_USER_DIR 00:29:36.321 #define SPDK_CONFIG_VHOST 1 00:29:36.321 #define SPDK_CONFIG_VIRTIO 1 00:29:36.321 #undef SPDK_CONFIG_VTUNE 00:29:36.321 #define SPDK_CONFIG_VTUNE_DIR 00:29:36.321 #define SPDK_CONFIG_WERROR 1 00:29:36.321 #define SPDK_CONFIG_WPDK_DIR 00:29:36.321 #undef SPDK_CONFIG_XNVME 00:29:36.321 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:29:36.321 09:33:45 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:29:36.321 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:36.321 09:33:45 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:36.321 09:33:45 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:36.321 09:33:45 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:36.321 09:33:45 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:36.321 09:33:45 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:36.321 09:33:45 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:36.321 09:33:45 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:29:36.321 09:33:45 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:36.321 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:29:36.321 09:33:45 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:29:36.322 09:33:45 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:29:36.322 09:33:45 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:29:36.322 09:33:45 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:36.322 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 251732 ]] 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 251732 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v 
testdir ]] 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.vyubBy 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.vyubBy/tests/interrupt /tmp/spdk.vyubBy 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88776204288 00:29:36.323 09:33:45 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5732311040 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892283904 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9420800 00:29:36.323 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253442560 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=815104 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:29:36.324 * Looking for test storage... 
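set_test_storage, whose df -T scan appears above and whose selection continues just below, walks a list of candidate directories and settles on one whose filesystem has enough free space, falling back to a scratch path from mktemp -udt spdk.XXXXXX. A rough, self-contained sketch of the space check applied to a single candidate (units are made explicit here; the real helper keeps the whole mount table in arrays as shown in the trace):

# Rough sketch: does the filesystem backing $dir have at least $requested_size
# bytes free? Same awk filter over df output as in the trace above.
has_enough_space() {
    local dir=$1 requested_size=$2
    local mount target_space

    # Mount point backing the directory (skip the df header line).
    mount=$(df "$dir" | awk '$1 !~ /Filesystem/{print $6}')

    # Free space on that mount; df reports 1K blocks by default, so scale to bytes.
    target_space=$(( $(df "$dir" | awk '$1 !~ /Filesystem/{print $4}') * 1024 ))

    (( target_space >= requested_size ))
}

# Illustrative use, mirroring the ~2 GiB request seen in the trace:
# has_enough_space "$testdir" 2214592512 && export SPDK_TEST_STORAGE=$testdir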
00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88776204288 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7946903552 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:36.324 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=251773 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 251773 /var/tmp/spdk.sock 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 251773 ']' 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:36.324 09:33:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:29:36.324 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:36.324 09:33:45 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:36.324 [2024-07-15 09:33:45.154826] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:36.324 [2024-07-15 09:33:45.154892] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid251773 ] 00:29:36.584 [2024-07-15 09:33:45.282805] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:36.584 [2024-07-15 09:33:45.389627] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:36.584 [2024-07-15 09:33:45.389726] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:36.584 [2024-07-15 09:33:45.389740] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:36.584 [2024-07-15 09:33:45.471666] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:29:37.522 09:33:46 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:37.522 09:33:46 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:29:37.522 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:29:37.522 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:29:37.522 09:33:46 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:37.522 09:33:46 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:37.522 09:33:46 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:37.522 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:29:37.522 "name": "app_thread", 00:29:37.522 "id": 1, 00:29:37.522 "active_pollers": [], 00:29:37.522 "timed_pollers": [ 00:29:37.522 { 00:29:37.522 "name": "rpc_subsystem_poll_servers", 00:29:37.522 "id": 1, 00:29:37.522 "state": "waiting", 00:29:37.522 "run_count": 0, 00:29:37.522 "busy_count": 0, 00:29:37.522 "period_ticks": 9200000 00:29:37.522 } 00:29:37.522 ], 00:29:37.522 "paused_pollers": [] 00:29:37.522 }' 00:29:37.522 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:29:37.522 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:29:37.522 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:29:37.522 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:29:37.781 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:29:37.781 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:29:37.781 09:33:46 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:29:37.781 09:33:46 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:37.781 09:33:46 reap_unregistered_poller -- interrupt/common.sh@76 -- 
# dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:37.781 5000+0 records in 00:29:37.781 5000+0 records out 00:29:37.781 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0256889 s, 399 MB/s 00:29:37.781 09:33:46 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:38.039 AIO0 00:29:38.039 09:33:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:38.608 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:29:38.608 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:29:38.608 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:29:38.608 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:38.608 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:38.608 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:38.608 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:29:38.608 "name": "app_thread", 00:29:38.608 "id": 1, 00:29:38.608 "active_pollers": [], 00:29:38.608 "timed_pollers": [ 00:29:38.608 { 00:29:38.608 "name": "rpc_subsystem_poll_servers", 00:29:38.608 "id": 1, 00:29:38.608 "state": "waiting", 00:29:38.608 "run_count": 0, 00:29:38.608 "busy_count": 0, 00:29:38.608 "period_ticks": 9200000 00:29:38.608 } 00:29:38.608 ], 00:29:38.608 "paused_pollers": [] 00:29:38.608 }' 00:29:38.608 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:29:38.608 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:29:38.608 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:29:38.608 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:29:38.867 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:29:38.867 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:29:38.867 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:29:38.867 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 251773 00:29:38.867 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 251773 ']' 00:29:38.867 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 251773 00:29:38.867 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:29:38.867 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:38.867 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 251773 00:29:38.867 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:38.867 09:33:47 
reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:38.868 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 'killing process with pid 251773' 00:29:38.868 killing process with pid 251773 00:29:38.868 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 251773 00:29:38.868 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 251773 00:29:39.128 09:33:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:29:39.128 09:33:47 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:39.128 00:29:39.128 real 0m2.954s 00:29:39.128 user 0m2.012s 00:29:39.128 sys 0m0.714s 00:29:39.128 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:39.128 09:33:47 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:39.128 ************************************ 00:29:39.128 END TEST reap_unregistered_poller 00:29:39.128 ************************************ 00:29:39.128 09:33:47 -- common/autotest_common.sh@1142 -- # return 0 00:29:39.128 09:33:47 -- spdk/autotest.sh@198 -- # uname -s 00:29:39.128 09:33:47 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:29:39.128 09:33:47 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:29:39.128 09:33:47 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:29:39.128 09:33:47 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@260 -- # timing_exit lib 00:29:39.128 09:33:47 -- common/autotest_common.sh@728 -- # xtrace_disable 00:29:39.128 09:33:47 -- common/autotest_common.sh@10 -- # set +x 00:29:39.128 09:33:47 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:29:39.128 09:33:47 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:29:39.128 09:33:47 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:39.128 09:33:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:39.128 09:33:47 -- common/autotest_common.sh@10 -- # set +x 00:29:39.128 ************************************ 00:29:39.128 START TEST compress_compdev 00:29:39.128 ************************************ 00:29:39.128 09:33:47 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:29:39.388 * Looking for test storage... 
00:29:39.388 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:39.388 09:33:48 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:39.388 09:33:48 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:39.388 09:33:48 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:39.388 09:33:48 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:39.388 09:33:48 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:39.388 09:33:48 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:39.388 09:33:48 compress_compdev -- paths/export.sh@5 -- # export PATH 00:29:39.388 09:33:48 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:29:39.388 09:33:48 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=252220 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:39.388 09:33:48 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 252220 00:29:39.388 09:33:48 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 252220 ']' 00:29:39.388 09:33:48 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:39.388 09:33:48 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:39.388 09:33:48 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:39.388 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
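Before any of the compress RPCs below are issued, the harness blocks until this bdevperf instance is listening on /var/tmp/spdk.sock. A minimal sketch of that wait, assuming a plain check for the socket file is enough (the waitforlisten helper traced above actually retries an RPC call, with a max_retries budget of 100):

    # Simplified stand-in for waitforlisten: poll for the RPC UNIX socket.
    rpc_sock=/var/tmp/spdk.sock
    for _ in $(seq 1 100); do
        [ -S "$rpc_sock" ] && break   # socket exists, bdevperf is up
        sleep 0.5
    done
    [ -S "$rpc_sock" ] || { echo "bdevperf never came up" >&2; exit 1; }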
00:29:39.388 09:33:48 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:39.388 09:33:48 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:39.388 [2024-07-15 09:33:48.183692] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:29:39.388 [2024-07-15 09:33:48.183759] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid252220 ] 00:29:39.388 [2024-07-15 09:33:48.302757] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:39.648 [2024-07-15 09:33:48.409504] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:39.648 [2024-07-15 09:33:48.409511] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:40.586 [2024-07-15 09:33:49.173611] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:40.586 09:33:49 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:40.586 09:33:49 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:40.586 09:33:49 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:29:40.586 09:33:49 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:40.586 09:33:49 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:41.155 [2024-07-15 09:33:49.822937] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15d93c0 PMD being used: compress_qat 00:29:41.155 09:33:49 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:41.155 09:33:49 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:41.155 09:33:49 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:41.155 09:33:49 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:41.155 09:33:49 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:41.156 09:33:49 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:41.156 09:33:49 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:41.156 09:33:50 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:41.416 [ 00:29:41.416 { 00:29:41.416 "name": "Nvme0n1", 00:29:41.416 "aliases": [ 00:29:41.416 "01000000-0000-0000-5cd2-e43197705251" 00:29:41.416 ], 00:29:41.416 "product_name": "NVMe disk", 00:29:41.416 "block_size": 512, 00:29:41.416 "num_blocks": 15002931888, 00:29:41.416 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:41.416 "assigned_rate_limits": { 00:29:41.416 "rw_ios_per_sec": 0, 00:29:41.416 "rw_mbytes_per_sec": 0, 00:29:41.416 "r_mbytes_per_sec": 0, 00:29:41.416 "w_mbytes_per_sec": 0 00:29:41.416 }, 00:29:41.416 "claimed": false, 00:29:41.416 "zoned": false, 00:29:41.416 "supported_io_types": { 00:29:41.416 "read": true, 00:29:41.416 "write": true, 00:29:41.416 "unmap": true, 00:29:41.416 "flush": true, 00:29:41.416 "reset": true, 00:29:41.416 "nvme_admin": true, 00:29:41.416 "nvme_io": true, 00:29:41.416 "nvme_io_md": false, 00:29:41.416 "write_zeroes": true, 00:29:41.416 "zcopy": false, 
00:29:41.416 "get_zone_info": false, 00:29:41.416 "zone_management": false, 00:29:41.416 "zone_append": false, 00:29:41.416 "compare": false, 00:29:41.416 "compare_and_write": false, 00:29:41.416 "abort": true, 00:29:41.416 "seek_hole": false, 00:29:41.416 "seek_data": false, 00:29:41.416 "copy": false, 00:29:41.416 "nvme_iov_md": false 00:29:41.416 }, 00:29:41.416 "driver_specific": { 00:29:41.416 "nvme": [ 00:29:41.416 { 00:29:41.416 "pci_address": "0000:5e:00.0", 00:29:41.416 "trid": { 00:29:41.416 "trtype": "PCIe", 00:29:41.416 "traddr": "0000:5e:00.0" 00:29:41.416 }, 00:29:41.416 "ctrlr_data": { 00:29:41.416 "cntlid": 0, 00:29:41.416 "vendor_id": "0x8086", 00:29:41.416 "model_number": "INTEL SSDPF2KX076TZO", 00:29:41.416 "serial_number": "PHAC0301002G7P6CGN", 00:29:41.416 "firmware_revision": "JCV10200", 00:29:41.416 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:41.416 "oacs": { 00:29:41.416 "security": 1, 00:29:41.416 "format": 1, 00:29:41.416 "firmware": 1, 00:29:41.416 "ns_manage": 1 00:29:41.416 }, 00:29:41.416 "multi_ctrlr": false, 00:29:41.416 "ana_reporting": false 00:29:41.416 }, 00:29:41.416 "vs": { 00:29:41.416 "nvme_version": "1.3" 00:29:41.416 }, 00:29:41.416 "ns_data": { 00:29:41.416 "id": 1, 00:29:41.416 "can_share": false 00:29:41.416 }, 00:29:41.416 "security": { 00:29:41.416 "opal": true 00:29:41.416 } 00:29:41.416 } 00:29:41.416 ], 00:29:41.416 "mp_policy": "active_passive" 00:29:41.416 } 00:29:41.416 } 00:29:41.416 ] 00:29:41.416 09:33:50 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:41.416 09:33:50 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:41.675 [2024-07-15 09:33:50.572532] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x143e0d0 PMD being used: compress_qat 00:29:44.211 47266773-c6d9-40e6-9293-ee53e60578bd 00:29:44.211 09:33:52 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:44.211 7e0bd51e-1ff3-41ef-8174-7a70784334bc 00:29:44.211 09:33:53 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:44.211 09:33:53 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:44.211 09:33:53 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:44.211 09:33:53 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:44.211 09:33:53 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:44.211 09:33:53 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:44.211 09:33:53 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:44.469 09:33:53 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:44.727 [ 00:29:44.727 { 00:29:44.727 "name": "7e0bd51e-1ff3-41ef-8174-7a70784334bc", 00:29:44.727 "aliases": [ 00:29:44.727 "lvs0/lv0" 00:29:44.727 ], 00:29:44.727 "product_name": "Logical Volume", 00:29:44.727 "block_size": 512, 00:29:44.727 "num_blocks": 204800, 00:29:44.727 "uuid": "7e0bd51e-1ff3-41ef-8174-7a70784334bc", 00:29:44.727 "assigned_rate_limits": { 00:29:44.727 "rw_ios_per_sec": 0, 00:29:44.727 "rw_mbytes_per_sec": 0, 00:29:44.727 "r_mbytes_per_sec": 0, 00:29:44.727 
"w_mbytes_per_sec": 0 00:29:44.727 }, 00:29:44.727 "claimed": false, 00:29:44.727 "zoned": false, 00:29:44.727 "supported_io_types": { 00:29:44.727 "read": true, 00:29:44.727 "write": true, 00:29:44.727 "unmap": true, 00:29:44.727 "flush": false, 00:29:44.727 "reset": true, 00:29:44.727 "nvme_admin": false, 00:29:44.727 "nvme_io": false, 00:29:44.727 "nvme_io_md": false, 00:29:44.727 "write_zeroes": true, 00:29:44.727 "zcopy": false, 00:29:44.727 "get_zone_info": false, 00:29:44.727 "zone_management": false, 00:29:44.727 "zone_append": false, 00:29:44.727 "compare": false, 00:29:44.727 "compare_and_write": false, 00:29:44.727 "abort": false, 00:29:44.727 "seek_hole": true, 00:29:44.727 "seek_data": true, 00:29:44.727 "copy": false, 00:29:44.727 "nvme_iov_md": false 00:29:44.727 }, 00:29:44.727 "driver_specific": { 00:29:44.727 "lvol": { 00:29:44.727 "lvol_store_uuid": "47266773-c6d9-40e6-9293-ee53e60578bd", 00:29:44.727 "base_bdev": "Nvme0n1", 00:29:44.727 "thin_provision": true, 00:29:44.727 "num_allocated_clusters": 0, 00:29:44.727 "snapshot": false, 00:29:44.727 "clone": false, 00:29:44.727 "esnap_clone": false 00:29:44.727 } 00:29:44.727 } 00:29:44.727 } 00:29:44.727 ] 00:29:44.727 09:33:53 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:44.727 09:33:53 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:44.727 09:33:53 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:44.985 [2024-07-15 09:33:53.795261] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:44.985 COMP_lvs0/lv0 00:29:44.985 09:33:53 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:44.985 09:33:53 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:44.985 09:33:53 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:44.985 09:33:53 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:44.985 09:33:53 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:44.986 09:33:53 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:44.986 09:33:53 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:45.244 09:33:54 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:45.503 [ 00:29:45.503 { 00:29:45.503 "name": "COMP_lvs0/lv0", 00:29:45.503 "aliases": [ 00:29:45.503 "7c57e6cb-9edc-51b1-83b7-fcfb5025dd8e" 00:29:45.503 ], 00:29:45.503 "product_name": "compress", 00:29:45.503 "block_size": 512, 00:29:45.503 "num_blocks": 200704, 00:29:45.503 "uuid": "7c57e6cb-9edc-51b1-83b7-fcfb5025dd8e", 00:29:45.503 "assigned_rate_limits": { 00:29:45.503 "rw_ios_per_sec": 0, 00:29:45.503 "rw_mbytes_per_sec": 0, 00:29:45.503 "r_mbytes_per_sec": 0, 00:29:45.503 "w_mbytes_per_sec": 0 00:29:45.503 }, 00:29:45.503 "claimed": false, 00:29:45.503 "zoned": false, 00:29:45.503 "supported_io_types": { 00:29:45.503 "read": true, 00:29:45.503 "write": true, 00:29:45.503 "unmap": false, 00:29:45.503 "flush": false, 00:29:45.503 "reset": false, 00:29:45.503 "nvme_admin": false, 00:29:45.503 "nvme_io": false, 00:29:45.503 "nvme_io_md": false, 00:29:45.503 "write_zeroes": true, 00:29:45.503 
"zcopy": false, 00:29:45.503 "get_zone_info": false, 00:29:45.503 "zone_management": false, 00:29:45.503 "zone_append": false, 00:29:45.503 "compare": false, 00:29:45.503 "compare_and_write": false, 00:29:45.503 "abort": false, 00:29:45.503 "seek_hole": false, 00:29:45.503 "seek_data": false, 00:29:45.503 "copy": false, 00:29:45.503 "nvme_iov_md": false 00:29:45.503 }, 00:29:45.503 "driver_specific": { 00:29:45.503 "compress": { 00:29:45.503 "name": "COMP_lvs0/lv0", 00:29:45.503 "base_bdev_name": "7e0bd51e-1ff3-41ef-8174-7a70784334bc" 00:29:45.503 } 00:29:45.503 } 00:29:45.503 } 00:29:45.503 ] 00:29:45.503 09:33:54 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:45.503 09:33:54 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:45.503 [2024-07-15 09:33:54.437820] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa5941b15c0 PMD being used: compress_qat 00:29:45.503 [2024-07-15 09:33:54.440083] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15d6670 PMD being used: compress_qat 00:29:45.503 Running I/O for 3 seconds... 00:29:48.788 00:29:48.788 Latency(us) 00:29:48.788 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:48.788 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:48.788 Verification LBA range: start 0x0 length 0x3100 00:29:48.788 COMP_lvs0/lv0 : 3.00 5125.40 20.02 0.00 0.00 6190.86 477.27 5983.72 00:29:48.788 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:48.788 Verification LBA range: start 0x3100 length 0x3100 00:29:48.788 COMP_lvs0/lv0 : 3.00 5407.30 21.12 0.00 0.00 5880.61 400.70 5727.28 00:29:48.788 =================================================================================================================== 00:29:48.788 Total : 10532.69 41.14 0.00 0.00 6031.58 400.70 5983.72 00:29:48.788 0 00:29:48.788 09:33:57 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:48.788 09:33:57 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:48.788 09:33:57 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:49.047 09:33:57 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:49.047 09:33:57 compress_compdev -- compress/compress.sh@78 -- # killprocess 252220 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 252220 ']' 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 252220 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 252220 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 252220' 00:29:49.047 killing process with pid 252220 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@967 -- # kill 252220 00:29:49.047 Received shutdown 
signal, test time was about 3.000000 seconds 00:29:49.047 00:29:49.047 Latency(us) 00:29:49.047 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:49.047 =================================================================================================================== 00:29:49.047 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:49.047 09:33:57 compress_compdev -- common/autotest_common.sh@972 -- # wait 252220 00:29:52.376 09:34:00 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:29:52.376 09:34:00 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:52.376 09:34:00 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=253908 00:29:52.376 09:34:00 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:52.376 09:34:00 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:52.376 09:34:00 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 253908 00:29:52.376 09:34:00 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 253908 ']' 00:29:52.376 09:34:00 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:52.376 09:34:00 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:52.376 09:34:00 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:52.376 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:52.376 09:34:00 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:52.376 09:34:00 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:52.376 [2024-07-15 09:34:01.041444] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
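The pass starting here (run_bdevperf 32 4096 3 512) repeats the first one, except that create_vols now receives an explicit logical block size, so the compress vbdev is created with -l 512 instead of the driver default used above. The RPC, exactly as it is traced further down:

    # Second pass only: 512-byte logical block size for the compress vbdev.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512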
00:29:52.376 [2024-07-15 09:34:01.041516] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid253908 ] 00:29:52.376 [2024-07-15 09:34:01.161087] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:52.376 [2024-07-15 09:34:01.267797] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:52.376 [2024-07-15 09:34:01.267803] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:53.311 [2024-07-15 09:34:02.014748] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:53.311 09:34:02 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:53.311 09:34:02 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:53.311 09:34:02 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:29:53.311 09:34:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:53.311 09:34:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:53.879 [2024-07-15 09:34:02.655113] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x109e3c0 PMD being used: compress_qat 00:29:53.879 09:34:02 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:53.879 09:34:02 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:53.879 09:34:02 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:53.879 09:34:02 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:53.879 09:34:02 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:53.879 09:34:02 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:53.879 09:34:02 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:54.137 09:34:02 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:54.394 [ 00:29:54.395 { 00:29:54.395 "name": "Nvme0n1", 00:29:54.395 "aliases": [ 00:29:54.395 "01000000-0000-0000-5cd2-e43197705251" 00:29:54.395 ], 00:29:54.395 "product_name": "NVMe disk", 00:29:54.395 "block_size": 512, 00:29:54.395 "num_blocks": 15002931888, 00:29:54.395 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:54.395 "assigned_rate_limits": { 00:29:54.395 "rw_ios_per_sec": 0, 00:29:54.395 "rw_mbytes_per_sec": 0, 00:29:54.395 "r_mbytes_per_sec": 0, 00:29:54.395 "w_mbytes_per_sec": 0 00:29:54.395 }, 00:29:54.395 "claimed": false, 00:29:54.395 "zoned": false, 00:29:54.395 "supported_io_types": { 00:29:54.395 "read": true, 00:29:54.395 "write": true, 00:29:54.395 "unmap": true, 00:29:54.395 "flush": true, 00:29:54.395 "reset": true, 00:29:54.395 "nvme_admin": true, 00:29:54.395 "nvme_io": true, 00:29:54.395 "nvme_io_md": false, 00:29:54.395 "write_zeroes": true, 00:29:54.395 "zcopy": false, 00:29:54.395 "get_zone_info": false, 00:29:54.395 "zone_management": false, 00:29:54.395 "zone_append": false, 00:29:54.395 "compare": false, 00:29:54.395 "compare_and_write": false, 00:29:54.395 "abort": true, 00:29:54.395 "seek_hole": false, 00:29:54.395 "seek_data": false, 00:29:54.395 
"copy": false, 00:29:54.395 "nvme_iov_md": false 00:29:54.395 }, 00:29:54.395 "driver_specific": { 00:29:54.395 "nvme": [ 00:29:54.395 { 00:29:54.395 "pci_address": "0000:5e:00.0", 00:29:54.395 "trid": { 00:29:54.395 "trtype": "PCIe", 00:29:54.395 "traddr": "0000:5e:00.0" 00:29:54.395 }, 00:29:54.395 "ctrlr_data": { 00:29:54.395 "cntlid": 0, 00:29:54.395 "vendor_id": "0x8086", 00:29:54.395 "model_number": "INTEL SSDPF2KX076TZO", 00:29:54.395 "serial_number": "PHAC0301002G7P6CGN", 00:29:54.395 "firmware_revision": "JCV10200", 00:29:54.395 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:54.395 "oacs": { 00:29:54.395 "security": 1, 00:29:54.395 "format": 1, 00:29:54.395 "firmware": 1, 00:29:54.395 "ns_manage": 1 00:29:54.395 }, 00:29:54.395 "multi_ctrlr": false, 00:29:54.395 "ana_reporting": false 00:29:54.395 }, 00:29:54.395 "vs": { 00:29:54.395 "nvme_version": "1.3" 00:29:54.395 }, 00:29:54.395 "ns_data": { 00:29:54.395 "id": 1, 00:29:54.395 "can_share": false 00:29:54.395 }, 00:29:54.395 "security": { 00:29:54.395 "opal": true 00:29:54.395 } 00:29:54.395 } 00:29:54.395 ], 00:29:54.395 "mp_policy": "active_passive" 00:29:54.395 } 00:29:54.395 } 00:29:54.395 ] 00:29:54.395 09:34:03 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:54.395 09:34:03 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:54.652 [2024-07-15 09:34:03.424982] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf03660 PMD being used: compress_qat 00:29:57.186 9d9afa0b-29bf-4a16-b19e-3aa3b9b85a6e 00:29:57.186 09:34:05 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:57.186 56d25599-9d30-4698-8b3e-ac16f8953b7f 00:29:57.186 09:34:05 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:57.186 09:34:05 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:57.186 09:34:05 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:57.186 09:34:05 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:57.186 09:34:05 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:57.186 09:34:05 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:57.186 09:34:05 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:57.446 09:34:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:57.446 [ 00:29:57.446 { 00:29:57.446 "name": "56d25599-9d30-4698-8b3e-ac16f8953b7f", 00:29:57.446 "aliases": [ 00:29:57.446 "lvs0/lv0" 00:29:57.446 ], 00:29:57.446 "product_name": "Logical Volume", 00:29:57.446 "block_size": 512, 00:29:57.446 "num_blocks": 204800, 00:29:57.446 "uuid": "56d25599-9d30-4698-8b3e-ac16f8953b7f", 00:29:57.446 "assigned_rate_limits": { 00:29:57.446 "rw_ios_per_sec": 0, 00:29:57.446 "rw_mbytes_per_sec": 0, 00:29:57.446 "r_mbytes_per_sec": 0, 00:29:57.446 "w_mbytes_per_sec": 0 00:29:57.446 }, 00:29:57.446 "claimed": false, 00:29:57.446 "zoned": false, 00:29:57.446 "supported_io_types": { 00:29:57.446 "read": true, 00:29:57.446 "write": true, 00:29:57.446 "unmap": true, 00:29:57.446 "flush": false, 00:29:57.446 "reset": true, 00:29:57.446 
"nvme_admin": false, 00:29:57.446 "nvme_io": false, 00:29:57.446 "nvme_io_md": false, 00:29:57.446 "write_zeroes": true, 00:29:57.446 "zcopy": false, 00:29:57.446 "get_zone_info": false, 00:29:57.446 "zone_management": false, 00:29:57.446 "zone_append": false, 00:29:57.446 "compare": false, 00:29:57.446 "compare_and_write": false, 00:29:57.446 "abort": false, 00:29:57.446 "seek_hole": true, 00:29:57.446 "seek_data": true, 00:29:57.446 "copy": false, 00:29:57.446 "nvme_iov_md": false 00:29:57.446 }, 00:29:57.446 "driver_specific": { 00:29:57.446 "lvol": { 00:29:57.446 "lvol_store_uuid": "9d9afa0b-29bf-4a16-b19e-3aa3b9b85a6e", 00:29:57.446 "base_bdev": "Nvme0n1", 00:29:57.446 "thin_provision": true, 00:29:57.446 "num_allocated_clusters": 0, 00:29:57.446 "snapshot": false, 00:29:57.446 "clone": false, 00:29:57.446 "esnap_clone": false 00:29:57.446 } 00:29:57.446 } 00:29:57.446 } 00:29:57.446 ] 00:29:57.446 09:34:06 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:57.446 09:34:06 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:29:57.446 09:34:06 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:29:57.705 [2024-07-15 09:34:06.619732] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:57.705 COMP_lvs0/lv0 00:29:57.705 09:34:06 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:57.705 09:34:06 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:57.705 09:34:06 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:57.705 09:34:06 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:57.705 09:34:06 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:57.705 09:34:06 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:57.705 09:34:06 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:57.964 09:34:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:58.224 [ 00:29:58.224 { 00:29:58.224 "name": "COMP_lvs0/lv0", 00:29:58.224 "aliases": [ 00:29:58.224 "3b781142-2768-5dc7-a2b0-ca746a1f300e" 00:29:58.224 ], 00:29:58.224 "product_name": "compress", 00:29:58.224 "block_size": 512, 00:29:58.224 "num_blocks": 200704, 00:29:58.224 "uuid": "3b781142-2768-5dc7-a2b0-ca746a1f300e", 00:29:58.224 "assigned_rate_limits": { 00:29:58.224 "rw_ios_per_sec": 0, 00:29:58.224 "rw_mbytes_per_sec": 0, 00:29:58.224 "r_mbytes_per_sec": 0, 00:29:58.224 "w_mbytes_per_sec": 0 00:29:58.224 }, 00:29:58.224 "claimed": false, 00:29:58.224 "zoned": false, 00:29:58.224 "supported_io_types": { 00:29:58.224 "read": true, 00:29:58.224 "write": true, 00:29:58.224 "unmap": false, 00:29:58.224 "flush": false, 00:29:58.224 "reset": false, 00:29:58.224 "nvme_admin": false, 00:29:58.224 "nvme_io": false, 00:29:58.224 "nvme_io_md": false, 00:29:58.224 "write_zeroes": true, 00:29:58.224 "zcopy": false, 00:29:58.224 "get_zone_info": false, 00:29:58.224 "zone_management": false, 00:29:58.224 "zone_append": false, 00:29:58.224 "compare": false, 00:29:58.224 "compare_and_write": false, 00:29:58.224 "abort": false, 00:29:58.224 "seek_hole": false, 00:29:58.224 
"seek_data": false, 00:29:58.224 "copy": false, 00:29:58.224 "nvme_iov_md": false 00:29:58.224 }, 00:29:58.224 "driver_specific": { 00:29:58.224 "compress": { 00:29:58.224 "name": "COMP_lvs0/lv0", 00:29:58.224 "base_bdev_name": "56d25599-9d30-4698-8b3e-ac16f8953b7f" 00:29:58.224 } 00:29:58.224 } 00:29:58.224 } 00:29:58.224 ] 00:29:58.224 09:34:07 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:58.224 09:34:07 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:58.483 [2024-07-15 09:34:07.261881] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f0d601b15c0 PMD being used: compress_qat 00:29:58.483 [2024-07-15 09:34:07.264097] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x109b770 PMD being used: compress_qat 00:29:58.483 Running I/O for 3 seconds... 00:30:01.771 00:30:01.771 Latency(us) 00:30:01.771 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:01.771 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:01.771 Verification LBA range: start 0x0 length 0x3100 00:30:01.771 COMP_lvs0/lv0 : 3.00 5140.45 20.08 0.00 0.00 6174.02 429.19 5584.81 00:30:01.771 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:01.771 Verification LBA range: start 0x3100 length 0x3100 00:30:01.771 COMP_lvs0/lv0 : 3.00 5413.80 21.15 0.00 0.00 5874.64 320.56 5527.82 00:30:01.771 =================================================================================================================== 00:30:01.771 Total : 10554.25 41.23 0.00 0.00 6020.46 320.56 5584.81 00:30:01.771 0 00:30:01.771 09:34:10 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:01.771 09:34:10 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:01.771 09:34:10 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:02.032 09:34:10 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:02.032 09:34:10 compress_compdev -- compress/compress.sh@78 -- # killprocess 253908 00:30:02.032 09:34:10 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 253908 ']' 00:30:02.032 09:34:10 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 253908 00:30:02.032 09:34:10 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:02.032 09:34:10 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:02.032 09:34:10 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 253908 00:30:02.032 09:34:10 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:02.032 09:34:10 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:02.032 09:34:10 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 253908' 00:30:02.032 killing process with pid 253908 00:30:02.032 09:34:10 compress_compdev -- common/autotest_common.sh@967 -- # kill 253908 00:30:02.032 Received shutdown signal, test time was about 3.000000 seconds 00:30:02.032 00:30:02.032 Latency(us) 00:30:02.032 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:02.032 
=================================================================================================================== 00:30:02.033 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:02.033 09:34:10 compress_compdev -- common/autotest_common.sh@972 -- # wait 253908 00:30:05.322 09:34:13 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:05.322 09:34:13 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:05.322 09:34:13 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=255580 00:30:05.322 09:34:13 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:05.322 09:34:13 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:05.322 09:34:13 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 255580 00:30:05.322 09:34:13 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 255580 ']' 00:30:05.322 09:34:13 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:05.322 09:34:13 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:05.322 09:34:13 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:05.322 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:05.322 09:34:13 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:05.322 09:34:13 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:05.322 [2024-07-15 09:34:13.908641] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
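Each of the three passes in this run goes through the same create_vols/destroy_vols cycle, differing only in the -l value handed to bdev_compress_create (unset, 512, and 4096 for the pass starting here). A condensed sketch of that cycle, assembled from the traces above: the waitforbdev/bdev_get_bdevs polling steps are omitted, lb_size and bdevperf_pid stand in for the per-pass values visible in the log, and piping gen_nvme.sh into load_subsystem_config is an assumption since the trace only shows both commands at compress.sh@34:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC=$SPDK/scripts/rpc.py
    # create_vols: NVMe bdev config, lvstore, thin lvol, compress vbdev on top
    $SPDK/scripts/gen_nvme.sh | $RPC load_subsystem_config
    $RPC bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    $RPC bdev_lvol_create -t -l lvs0 lv0 100
    $RPC bdev_compress_create -b lvs0/lv0 -p /tmp/pmem ${lb_size:+-l $lb_size}
    # drive verify I/O for 3 seconds through the already-running bdevperf
    $SPDK/examples/bdev/bdevperf/bdevperf.py perform_tests
    # destroy_vols + killprocess: tear down in reverse order before the next pass
    $RPC bdev_compress_delete COMP_lvs0/lv0
    $RPC bdev_lvol_delete_lvstore -l lvs0
    kill "$bdevperf_pid" && wait "$bdevperf_pid"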
00:30:05.322 [2024-07-15 09:34:13.908712] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid255580 ] 00:30:05.322 [2024-07-15 09:34:14.028494] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:05.322 [2024-07-15 09:34:14.135644] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:05.322 [2024-07-15 09:34:14.135651] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:06.258 [2024-07-15 09:34:14.892320] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:06.258 09:34:14 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:06.258 09:34:14 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:06.258 09:34:14 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:30:06.258 09:34:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:06.258 09:34:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:06.826 [2024-07-15 09:34:15.527068] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b223c0 PMD being used: compress_qat 00:30:06.826 09:34:15 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:06.826 09:34:15 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:06.826 09:34:15 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:06.826 09:34:15 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:06.826 09:34:15 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:06.826 09:34:15 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:06.826 09:34:15 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:07.084 09:34:15 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:07.084 [ 00:30:07.084 { 00:30:07.084 "name": "Nvme0n1", 00:30:07.084 "aliases": [ 00:30:07.084 "01000000-0000-0000-5cd2-e43197705251" 00:30:07.084 ], 00:30:07.084 "product_name": "NVMe disk", 00:30:07.084 "block_size": 512, 00:30:07.084 "num_blocks": 15002931888, 00:30:07.084 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:07.084 "assigned_rate_limits": { 00:30:07.084 "rw_ios_per_sec": 0, 00:30:07.084 "rw_mbytes_per_sec": 0, 00:30:07.084 "r_mbytes_per_sec": 0, 00:30:07.084 "w_mbytes_per_sec": 0 00:30:07.084 }, 00:30:07.084 "claimed": false, 00:30:07.084 "zoned": false, 00:30:07.084 "supported_io_types": { 00:30:07.084 "read": true, 00:30:07.084 "write": true, 00:30:07.084 "unmap": true, 00:30:07.084 "flush": true, 00:30:07.084 "reset": true, 00:30:07.084 "nvme_admin": true, 00:30:07.084 "nvme_io": true, 00:30:07.084 "nvme_io_md": false, 00:30:07.084 "write_zeroes": true, 00:30:07.084 "zcopy": false, 00:30:07.084 "get_zone_info": false, 00:30:07.084 "zone_management": false, 00:30:07.084 "zone_append": false, 00:30:07.084 "compare": false, 00:30:07.084 "compare_and_write": false, 00:30:07.084 "abort": true, 00:30:07.084 "seek_hole": false, 00:30:07.084 "seek_data": false, 00:30:07.084 
"copy": false, 00:30:07.084 "nvme_iov_md": false 00:30:07.084 }, 00:30:07.084 "driver_specific": { 00:30:07.084 "nvme": [ 00:30:07.084 { 00:30:07.084 "pci_address": "0000:5e:00.0", 00:30:07.084 "trid": { 00:30:07.084 "trtype": "PCIe", 00:30:07.084 "traddr": "0000:5e:00.0" 00:30:07.084 }, 00:30:07.084 "ctrlr_data": { 00:30:07.084 "cntlid": 0, 00:30:07.084 "vendor_id": "0x8086", 00:30:07.084 "model_number": "INTEL SSDPF2KX076TZO", 00:30:07.084 "serial_number": "PHAC0301002G7P6CGN", 00:30:07.084 "firmware_revision": "JCV10200", 00:30:07.084 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:07.084 "oacs": { 00:30:07.084 "security": 1, 00:30:07.084 "format": 1, 00:30:07.084 "firmware": 1, 00:30:07.084 "ns_manage": 1 00:30:07.084 }, 00:30:07.084 "multi_ctrlr": false, 00:30:07.084 "ana_reporting": false 00:30:07.084 }, 00:30:07.084 "vs": { 00:30:07.084 "nvme_version": "1.3" 00:30:07.084 }, 00:30:07.084 "ns_data": { 00:30:07.084 "id": 1, 00:30:07.084 "can_share": false 00:30:07.084 }, 00:30:07.084 "security": { 00:30:07.084 "opal": true 00:30:07.084 } 00:30:07.084 } 00:30:07.084 ], 00:30:07.084 "mp_policy": "active_passive" 00:30:07.084 } 00:30:07.084 } 00:30:07.084 ] 00:30:07.343 09:34:16 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:07.343 09:34:16 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:07.343 [2024-07-15 09:34:16.288917] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x19870d0 PMD being used: compress_qat 00:30:09.874 de5c4e78-a214-4230-bd56-b7d654084a60 00:30:09.874 09:34:18 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:10.132 b1882a07-2de8-4e8d-aaf6-39a88a83dd4c 00:30:10.132 09:34:18 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:10.132 09:34:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:10.132 09:34:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:10.132 09:34:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:10.132 09:34:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:10.132 09:34:18 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:10.132 09:34:18 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:10.422 09:34:19 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:10.681 [ 00:30:10.681 { 00:30:10.681 "name": "b1882a07-2de8-4e8d-aaf6-39a88a83dd4c", 00:30:10.681 "aliases": [ 00:30:10.681 "lvs0/lv0" 00:30:10.681 ], 00:30:10.681 "product_name": "Logical Volume", 00:30:10.681 "block_size": 512, 00:30:10.681 "num_blocks": 204800, 00:30:10.681 "uuid": "b1882a07-2de8-4e8d-aaf6-39a88a83dd4c", 00:30:10.681 "assigned_rate_limits": { 00:30:10.681 "rw_ios_per_sec": 0, 00:30:10.681 "rw_mbytes_per_sec": 0, 00:30:10.681 "r_mbytes_per_sec": 0, 00:30:10.681 "w_mbytes_per_sec": 0 00:30:10.681 }, 00:30:10.681 "claimed": false, 00:30:10.681 "zoned": false, 00:30:10.681 "supported_io_types": { 00:30:10.681 "read": true, 00:30:10.681 "write": true, 00:30:10.681 "unmap": true, 00:30:10.681 "flush": false, 00:30:10.681 "reset": true, 00:30:10.681 
"nvme_admin": false, 00:30:10.681 "nvme_io": false, 00:30:10.681 "nvme_io_md": false, 00:30:10.681 "write_zeroes": true, 00:30:10.681 "zcopy": false, 00:30:10.681 "get_zone_info": false, 00:30:10.681 "zone_management": false, 00:30:10.681 "zone_append": false, 00:30:10.681 "compare": false, 00:30:10.681 "compare_and_write": false, 00:30:10.681 "abort": false, 00:30:10.681 "seek_hole": true, 00:30:10.681 "seek_data": true, 00:30:10.681 "copy": false, 00:30:10.681 "nvme_iov_md": false 00:30:10.681 }, 00:30:10.681 "driver_specific": { 00:30:10.681 "lvol": { 00:30:10.681 "lvol_store_uuid": "de5c4e78-a214-4230-bd56-b7d654084a60", 00:30:10.681 "base_bdev": "Nvme0n1", 00:30:10.681 "thin_provision": true, 00:30:10.681 "num_allocated_clusters": 0, 00:30:10.681 "snapshot": false, 00:30:10.681 "clone": false, 00:30:10.681 "esnap_clone": false 00:30:10.681 } 00:30:10.681 } 00:30:10.681 } 00:30:10.681 ] 00:30:10.681 09:34:19 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:10.681 09:34:19 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:10.681 09:34:19 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:10.941 [2024-07-15 09:34:19.639696] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:10.941 COMP_lvs0/lv0 00:30:10.941 09:34:19 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:10.941 09:34:19 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:10.941 09:34:19 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:10.941 09:34:19 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:10.941 09:34:19 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:10.941 09:34:19 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:10.941 09:34:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:11.200 09:34:19 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:11.200 [ 00:30:11.200 { 00:30:11.200 "name": "COMP_lvs0/lv0", 00:30:11.200 "aliases": [ 00:30:11.200 "eb9c239e-dd91-5ac9-a653-70bcc621f890" 00:30:11.200 ], 00:30:11.200 "product_name": "compress", 00:30:11.200 "block_size": 4096, 00:30:11.200 "num_blocks": 25088, 00:30:11.200 "uuid": "eb9c239e-dd91-5ac9-a653-70bcc621f890", 00:30:11.200 "assigned_rate_limits": { 00:30:11.200 "rw_ios_per_sec": 0, 00:30:11.200 "rw_mbytes_per_sec": 0, 00:30:11.200 "r_mbytes_per_sec": 0, 00:30:11.200 "w_mbytes_per_sec": 0 00:30:11.200 }, 00:30:11.200 "claimed": false, 00:30:11.200 "zoned": false, 00:30:11.200 "supported_io_types": { 00:30:11.200 "read": true, 00:30:11.200 "write": true, 00:30:11.200 "unmap": false, 00:30:11.200 "flush": false, 00:30:11.200 "reset": false, 00:30:11.200 "nvme_admin": false, 00:30:11.200 "nvme_io": false, 00:30:11.200 "nvme_io_md": false, 00:30:11.200 "write_zeroes": true, 00:30:11.200 "zcopy": false, 00:30:11.200 "get_zone_info": false, 00:30:11.200 "zone_management": false, 00:30:11.200 "zone_append": false, 00:30:11.200 "compare": false, 00:30:11.200 "compare_and_write": false, 00:30:11.200 "abort": false, 00:30:11.200 "seek_hole": false, 00:30:11.200 
"seek_data": false, 00:30:11.200 "copy": false, 00:30:11.200 "nvme_iov_md": false 00:30:11.200 }, 00:30:11.200 "driver_specific": { 00:30:11.200 "compress": { 00:30:11.200 "name": "COMP_lvs0/lv0", 00:30:11.200 "base_bdev_name": "b1882a07-2de8-4e8d-aaf6-39a88a83dd4c" 00:30:11.200 } 00:30:11.200 } 00:30:11.200 } 00:30:11.200 ] 00:30:11.200 09:34:20 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:11.200 09:34:20 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:11.459 [2024-07-15 09:34:20.258308] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7faae81b15c0 PMD being used: compress_qat 00:30:11.459 [2024-07-15 09:34:20.260599] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b1f700 PMD being used: compress_qat 00:30:11.459 Running I/O for 3 seconds... 00:30:14.772 00:30:14.772 Latency(us) 00:30:14.772 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:14.772 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:14.772 Verification LBA range: start 0x0 length 0x3100 00:30:14.772 COMP_lvs0/lv0 : 3.00 5115.15 19.98 0.00 0.00 6202.02 247.54 6411.13 00:30:14.772 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:14.772 Verification LBA range: start 0x3100 length 0x3100 00:30:14.772 COMP_lvs0/lv0 : 3.00 5363.18 20.95 0.00 0.00 5928.98 174.53 6496.61 00:30:14.772 =================================================================================================================== 00:30:14.772 Total : 10478.33 40.93 0.00 0.00 6062.29 174.53 6496.61 00:30:14.772 0 00:30:14.772 09:34:23 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:14.772 09:34:23 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:14.772 09:34:23 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:15.031 09:34:23 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:15.031 09:34:23 compress_compdev -- compress/compress.sh@78 -- # killprocess 255580 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 255580 ']' 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 255580 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 255580 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 255580' 00:30:15.031 killing process with pid 255580 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@967 -- # kill 255580 00:30:15.031 Received shutdown signal, test time was about 3.000000 seconds 00:30:15.031 00:30:15.031 Latency(us) 00:30:15.031 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:15.031 
=================================================================================================================== 00:30:15.031 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:15.031 09:34:23 compress_compdev -- common/autotest_common.sh@972 -- # wait 255580 00:30:18.321 09:34:26 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:30:18.321 09:34:26 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:18.321 09:34:26 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=257190 00:30:18.321 09:34:26 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:18.321 09:34:26 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:30:18.321 09:34:26 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 257190 00:30:18.321 09:34:26 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 257190 ']' 00:30:18.321 09:34:26 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:18.321 09:34:26 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:18.321 09:34:26 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:18.321 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:18.321 09:34:26 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:18.321 09:34:26 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:18.321 [2024-07-15 09:34:26.885144] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
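Note, before the bdevio pass below gets going: the bdevperf pass that just finished built and tore down its compressed volume entirely through the rpc.py calls traced above, and the bdevio pass reuses the same create_vols/destroy_vols helpers. The following is a minimal standalone sketch of that sequence, assuming an SPDK app is already listening on /var/tmp/spdk.sock; the $SPDK/$rpc shorthands, the pipe between gen_nvme.sh and load_subsystem_config, and running it outside the autotest harness are assumptions, while every command and argument is taken from the trace.

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc="$SPDK/scripts/rpc.py"

    # create_vols: attach the local NVMe controller, then layer lvstore -> lvol -> compress bdev
    "$SPDK/scripts/gen_nvme.sh" | $rpc load_subsystem_config    # pipe assumed from the compress.sh@34 trace
    $rpc bdev_wait_for_examine
    $rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    $rpc bdev_lvol_create -t -l lvs0 lv0 100                    # thin lvol, 100 MiB (204800 x 512 B in the output above)
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096  # 4 KiB logical blocks; the bdevio pass below omits -l

    # ... run I/O against COMP_lvs0/lv0 (bdevperf.py / tests.py perform_tests in this log) ...

    # destroy_vols: tear down in reverse order
    $rpc bdev_compress_delete COMP_lvs0/lv0
    $rpc bdev_lvol_delete_lvstore -l lvs0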
00:30:18.321 [2024-07-15 09:34:26.885215] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid257190 ] 00:30:18.321 [2024-07-15 09:34:27.014337] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:18.321 [2024-07-15 09:34:27.123337] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:18.321 [2024-07-15 09:34:27.123423] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:18.321 [2024-07-15 09:34:27.123429] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:19.260 [2024-07-15 09:34:27.882589] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:19.260 09:34:27 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:19.260 09:34:27 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:19.260 09:34:27 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:30:19.260 09:34:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:19.260 09:34:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:19.829 [2024-07-15 09:34:28.535673] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x28c6f20 PMD being used: compress_qat 00:30:19.829 09:34:28 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:19.829 09:34:28 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:19.830 09:34:28 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:19.830 09:34:28 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:19.830 09:34:28 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:19.830 09:34:28 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:19.830 09:34:28 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:20.089 09:34:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:20.089 [ 00:30:20.089 { 00:30:20.089 "name": "Nvme0n1", 00:30:20.089 "aliases": [ 00:30:20.089 "01000000-0000-0000-5cd2-e43197705251" 00:30:20.089 ], 00:30:20.089 "product_name": "NVMe disk", 00:30:20.089 "block_size": 512, 00:30:20.089 "num_blocks": 15002931888, 00:30:20.089 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:20.089 "assigned_rate_limits": { 00:30:20.089 "rw_ios_per_sec": 0, 00:30:20.089 "rw_mbytes_per_sec": 0, 00:30:20.089 "r_mbytes_per_sec": 0, 00:30:20.089 "w_mbytes_per_sec": 0 00:30:20.089 }, 00:30:20.089 "claimed": false, 00:30:20.089 "zoned": false, 00:30:20.089 "supported_io_types": { 00:30:20.089 "read": true, 00:30:20.089 "write": true, 00:30:20.089 "unmap": true, 00:30:20.089 "flush": true, 00:30:20.089 "reset": true, 00:30:20.089 "nvme_admin": true, 00:30:20.089 "nvme_io": true, 00:30:20.089 "nvme_io_md": false, 00:30:20.089 "write_zeroes": true, 00:30:20.089 "zcopy": false, 00:30:20.089 "get_zone_info": false, 00:30:20.089 "zone_management": false, 00:30:20.089 "zone_append": false, 00:30:20.089 "compare": false, 00:30:20.089 "compare_and_write": false, 
00:30:20.089 "abort": true, 00:30:20.089 "seek_hole": false, 00:30:20.089 "seek_data": false, 00:30:20.089 "copy": false, 00:30:20.089 "nvme_iov_md": false 00:30:20.089 }, 00:30:20.089 "driver_specific": { 00:30:20.089 "nvme": [ 00:30:20.089 { 00:30:20.089 "pci_address": "0000:5e:00.0", 00:30:20.089 "trid": { 00:30:20.089 "trtype": "PCIe", 00:30:20.089 "traddr": "0000:5e:00.0" 00:30:20.089 }, 00:30:20.089 "ctrlr_data": { 00:30:20.089 "cntlid": 0, 00:30:20.089 "vendor_id": "0x8086", 00:30:20.089 "model_number": "INTEL SSDPF2KX076TZO", 00:30:20.089 "serial_number": "PHAC0301002G7P6CGN", 00:30:20.089 "firmware_revision": "JCV10200", 00:30:20.089 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:20.089 "oacs": { 00:30:20.089 "security": 1, 00:30:20.089 "format": 1, 00:30:20.089 "firmware": 1, 00:30:20.089 "ns_manage": 1 00:30:20.089 }, 00:30:20.089 "multi_ctrlr": false, 00:30:20.089 "ana_reporting": false 00:30:20.089 }, 00:30:20.089 "vs": { 00:30:20.089 "nvme_version": "1.3" 00:30:20.089 }, 00:30:20.089 "ns_data": { 00:30:20.089 "id": 1, 00:30:20.089 "can_share": false 00:30:20.089 }, 00:30:20.089 "security": { 00:30:20.089 "opal": true 00:30:20.089 } 00:30:20.089 } 00:30:20.089 ], 00:30:20.090 "mp_policy": "active_passive" 00:30:20.090 } 00:30:20.090 } 00:30:20.090 ] 00:30:20.090 09:34:29 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:20.090 09:34:29 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:20.349 [2024-07-15 09:34:29.249119] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2715390 PMD being used: compress_qat 00:30:22.885 07a258bc-91f6-46db-a205-e5fc3ffb5e5c 00:30:22.885 09:34:31 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:22.885 74edcfb8-4cba-4f99-a790-62349fb75b56 00:30:22.885 09:34:31 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:22.885 09:34:31 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:22.885 09:34:31 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:22.885 09:34:31 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:22.885 09:34:31 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:22.885 09:34:31 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:22.885 09:34:31 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:23.144 09:34:31 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:23.404 [ 00:30:23.404 { 00:30:23.404 "name": "74edcfb8-4cba-4f99-a790-62349fb75b56", 00:30:23.404 "aliases": [ 00:30:23.404 "lvs0/lv0" 00:30:23.404 ], 00:30:23.404 "product_name": "Logical Volume", 00:30:23.404 "block_size": 512, 00:30:23.404 "num_blocks": 204800, 00:30:23.404 "uuid": "74edcfb8-4cba-4f99-a790-62349fb75b56", 00:30:23.404 "assigned_rate_limits": { 00:30:23.404 "rw_ios_per_sec": 0, 00:30:23.404 "rw_mbytes_per_sec": 0, 00:30:23.404 "r_mbytes_per_sec": 0, 00:30:23.404 "w_mbytes_per_sec": 0 00:30:23.404 }, 00:30:23.404 "claimed": false, 00:30:23.404 "zoned": false, 00:30:23.404 "supported_io_types": { 00:30:23.404 "read": true, 00:30:23.404 
"write": true, 00:30:23.404 "unmap": true, 00:30:23.404 "flush": false, 00:30:23.404 "reset": true, 00:30:23.404 "nvme_admin": false, 00:30:23.404 "nvme_io": false, 00:30:23.404 "nvme_io_md": false, 00:30:23.404 "write_zeroes": true, 00:30:23.404 "zcopy": false, 00:30:23.404 "get_zone_info": false, 00:30:23.404 "zone_management": false, 00:30:23.404 "zone_append": false, 00:30:23.404 "compare": false, 00:30:23.404 "compare_and_write": false, 00:30:23.404 "abort": false, 00:30:23.404 "seek_hole": true, 00:30:23.404 "seek_data": true, 00:30:23.404 "copy": false, 00:30:23.404 "nvme_iov_md": false 00:30:23.404 }, 00:30:23.404 "driver_specific": { 00:30:23.404 "lvol": { 00:30:23.404 "lvol_store_uuid": "07a258bc-91f6-46db-a205-e5fc3ffb5e5c", 00:30:23.404 "base_bdev": "Nvme0n1", 00:30:23.404 "thin_provision": true, 00:30:23.404 "num_allocated_clusters": 0, 00:30:23.404 "snapshot": false, 00:30:23.404 "clone": false, 00:30:23.404 "esnap_clone": false 00:30:23.404 } 00:30:23.404 } 00:30:23.404 } 00:30:23.404 ] 00:30:23.404 09:34:32 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:23.404 09:34:32 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:23.404 09:34:32 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:23.663 [2024-07-15 09:34:32.476872] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:23.663 COMP_lvs0/lv0 00:30:23.663 09:34:32 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:23.663 09:34:32 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:23.663 09:34:32 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:23.663 09:34:32 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:23.663 09:34:32 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:23.663 09:34:32 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:23.663 09:34:32 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:23.922 09:34:32 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:23.922 [ 00:30:23.922 { 00:30:23.922 "name": "COMP_lvs0/lv0", 00:30:23.922 "aliases": [ 00:30:23.922 "e974233b-b56f-53bc-addc-48c744f08c0b" 00:30:23.922 ], 00:30:23.922 "product_name": "compress", 00:30:23.922 "block_size": 512, 00:30:23.922 "num_blocks": 200704, 00:30:23.922 "uuid": "e974233b-b56f-53bc-addc-48c744f08c0b", 00:30:23.922 "assigned_rate_limits": { 00:30:23.922 "rw_ios_per_sec": 0, 00:30:23.922 "rw_mbytes_per_sec": 0, 00:30:23.922 "r_mbytes_per_sec": 0, 00:30:23.922 "w_mbytes_per_sec": 0 00:30:23.922 }, 00:30:23.922 "claimed": false, 00:30:23.922 "zoned": false, 00:30:23.922 "supported_io_types": { 00:30:23.922 "read": true, 00:30:23.922 "write": true, 00:30:23.922 "unmap": false, 00:30:23.922 "flush": false, 00:30:23.922 "reset": false, 00:30:23.922 "nvme_admin": false, 00:30:23.922 "nvme_io": false, 00:30:23.922 "nvme_io_md": false, 00:30:23.922 "write_zeroes": true, 00:30:23.922 "zcopy": false, 00:30:23.922 "get_zone_info": false, 00:30:23.922 "zone_management": false, 00:30:23.922 "zone_append": false, 00:30:23.922 "compare": false, 00:30:23.922 
"compare_and_write": false, 00:30:23.922 "abort": false, 00:30:23.922 "seek_hole": false, 00:30:23.922 "seek_data": false, 00:30:23.922 "copy": false, 00:30:23.922 "nvme_iov_md": false 00:30:23.922 }, 00:30:23.922 "driver_specific": { 00:30:23.922 "compress": { 00:30:23.922 "name": "COMP_lvs0/lv0", 00:30:23.922 "base_bdev_name": "74edcfb8-4cba-4f99-a790-62349fb75b56" 00:30:23.922 } 00:30:23.922 } 00:30:23.922 } 00:30:23.922 ] 00:30:24.181 09:34:32 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:24.181 09:34:32 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:24.181 [2024-07-15 09:34:32.993560] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f51c01b1350 PMD being used: compress_qat 00:30:24.181 I/O targets: 00:30:24.181 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:30:24.181 00:30:24.181 00:30:24.181 CUnit - A unit testing framework for C - Version 2.1-3 00:30:24.181 http://cunit.sourceforge.net/ 00:30:24.181 00:30:24.181 00:30:24.181 Suite: bdevio tests on: COMP_lvs0/lv0 00:30:24.181 Test: blockdev write read block ...passed 00:30:24.181 Test: blockdev write zeroes read block ...passed 00:30:24.181 Test: blockdev write zeroes read no split ...passed 00:30:24.181 Test: blockdev write zeroes read split ...passed 00:30:24.181 Test: blockdev write zeroes read split partial ...passed 00:30:24.181 Test: blockdev reset ...[2024-07-15 09:34:33.031014] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:30:24.181 passed 00:30:24.181 Test: blockdev write read 8 blocks ...passed 00:30:24.181 Test: blockdev write read size > 128k ...passed 00:30:24.181 Test: blockdev write read invalid size ...passed 00:30:24.181 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:24.181 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:24.182 Test: blockdev write read max offset ...passed 00:30:24.182 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:24.182 Test: blockdev writev readv 8 blocks ...passed 00:30:24.182 Test: blockdev writev readv 30 x 1block ...passed 00:30:24.182 Test: blockdev writev readv block ...passed 00:30:24.182 Test: blockdev writev readv size > 128k ...passed 00:30:24.182 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:24.182 Test: blockdev comparev and writev ...passed 00:30:24.182 Test: blockdev nvme passthru rw ...passed 00:30:24.182 Test: blockdev nvme passthru vendor specific ...passed 00:30:24.182 Test: blockdev nvme admin passthru ...passed 00:30:24.182 Test: blockdev copy ...passed 00:30:24.182 00:30:24.182 Run Summary: Type Total Ran Passed Failed Inactive 00:30:24.182 suites 1 1 n/a 0 0 00:30:24.182 tests 23 23 23 0 0 00:30:24.182 asserts 130 130 130 0 n/a 00:30:24.182 00:30:24.182 Elapsed time = 0.091 seconds 00:30:24.182 0 00:30:24.182 09:34:33 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:30:24.182 09:34:33 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:24.440 09:34:33 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:24.699 09:34:33 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:30:24.699 09:34:33 compress_compdev -- compress/compress.sh@62 -- # killprocess 
257190 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 257190 ']' 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 257190 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 257190 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 257190' 00:30:24.699 killing process with pid 257190 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@967 -- # kill 257190 00:30:24.699 09:34:33 compress_compdev -- common/autotest_common.sh@972 -- # wait 257190 00:30:27.988 09:34:36 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:30:27.988 09:34:36 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:30:27.988 00:30:27.988 real 0m48.448s 00:30:27.988 user 1m51.746s 00:30:27.988 sys 0m5.853s 00:30:27.988 09:34:36 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:27.988 09:34:36 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:27.988 ************************************ 00:30:27.988 END TEST compress_compdev 00:30:27.988 ************************************ 00:30:27.988 09:34:36 -- common/autotest_common.sh@1142 -- # return 0 00:30:27.988 09:34:36 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:30:27.988 09:34:36 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:27.988 09:34:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:27.988 09:34:36 -- common/autotest_common.sh@10 -- # set +x 00:30:27.988 ************************************ 00:30:27.988 START TEST compress_isal 00:30:27.988 ************************************ 00:30:27.988 09:34:36 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:30:27.988 * Looking for test storage... 
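The compress_compdev suite has just ended and the same test script is now re-run for the ISA-L backend; the run_test line above shows the exact invocation. A short sketch of the top-level difference, assuming only what the trace shows (run_test itself is an autotest_common.sh helper whose definition is not part of this log):

    # second invocation of the same script, backend selected by the positional argument
    run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal

    # inside compress.sh the argument becomes test_type=isal, and guards such as
    # [[ $test_type == compdev ]] (visible at compress.sh@50/@66 in the trace) select
    # the compressdev-specific variants, for example bdevio launched with the
    # dpdk.json config above; none of the compress_qat PMD notices from the earlier
    # passes appear in the isal runs that follow.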
00:30:27.988 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:27.988 09:34:36 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:27.988 09:34:36 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:30:27.988 09:34:36 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:27.988 09:34:36 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:27.988 09:34:36 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:27.989 09:34:36 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:27.989 09:34:36 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:27.989 09:34:36 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:27.989 09:34:36 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:27.989 09:34:36 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:27.989 09:34:36 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:27.989 09:34:36 compress_isal -- paths/export.sh@5 -- # export PATH 00:30:27.989 09:34:36 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@47 -- # : 0 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:27.989 09:34:36 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:27.989 09:34:36 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:27.989 09:34:36 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:27.989 09:34:36 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:30:27.989 09:34:36 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:27.989 09:34:36 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:27.989 09:34:36 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=258490 00:30:27.989 09:34:36 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:27.989 09:34:36 compress_isal -- compress/compress.sh@73 -- # waitforlisten 258490 00:30:27.989 09:34:36 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 258490 ']' 00:30:27.989 09:34:36 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:27.989 09:34:36 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:27.989 09:34:36 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:27.989 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
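The trace above shows the control flow every bdevperf pass in this log follows: start bdevperf idle with -z, wait for its RPC socket, build the volume stack over RPC, then trigger the timed run from outside. Below is a minimal sketch of that loop using the binaries and flags exactly as logged for this pass (pid 258490); backgrounding with & and capturing $! are assumptions standing in for the harness plumbing.

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

    # start bdevperf idle (-z: wait for RPC configuration); flags as logged
    "$SPDK/build/examples/bdevperf" -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
    bdevperf_pid=$!                      # 258490 in this pass

    # waitforlisten $bdevperf_pid: autotest_common.sh helper that returns once
    # /var/tmp/spdk.sock accepts RPCs (its body is not shown in this log)

    # create_vols: same rpc.py sequence as sketched earlier (Nvme0n1 -> lvs0/lv0 -> COMP_lvs0/lv0)

    # kick off the timed verify job and collect the latency table printed below
    "$SPDK/examples/bdev/bdevperf/bdevperf.py" perform_tests

    # then destroy_vols and killprocess $bdevperf_pid, as in the traces that follow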
00:30:27.989 09:34:36 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:27.989 09:34:36 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:27.989 09:34:36 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:27.989 [2024-07-15 09:34:36.672736] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:30:27.989 [2024-07-15 09:34:36.672810] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid258490 ] 00:30:27.989 [2024-07-15 09:34:36.791986] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:27.989 [2024-07-15 09:34:36.900706] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:27.989 [2024-07-15 09:34:36.900713] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:28.249 09:34:37 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:28.249 09:34:37 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:28.249 09:34:37 compress_isal -- compress/compress.sh@74 -- # create_vols 00:30:28.249 09:34:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:28.249 09:34:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:28.816 09:34:37 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:28.816 09:34:37 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:28.816 09:34:37 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:28.816 09:34:37 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:28.816 09:34:37 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:28.816 09:34:37 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:28.816 09:34:37 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:29.076 09:34:37 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:29.335 [ 00:30:29.335 { 00:30:29.335 "name": "Nvme0n1", 00:30:29.335 "aliases": [ 00:30:29.335 "01000000-0000-0000-5cd2-e43197705251" 00:30:29.335 ], 00:30:29.335 "product_name": "NVMe disk", 00:30:29.335 "block_size": 512, 00:30:29.335 "num_blocks": 15002931888, 00:30:29.335 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:29.335 "assigned_rate_limits": { 00:30:29.335 "rw_ios_per_sec": 0, 00:30:29.335 "rw_mbytes_per_sec": 0, 00:30:29.335 "r_mbytes_per_sec": 0, 00:30:29.335 "w_mbytes_per_sec": 0 00:30:29.335 }, 00:30:29.335 "claimed": false, 00:30:29.335 "zoned": false, 00:30:29.335 "supported_io_types": { 00:30:29.335 "read": true, 00:30:29.335 "write": true, 00:30:29.335 "unmap": true, 00:30:29.335 "flush": true, 00:30:29.335 "reset": true, 00:30:29.335 "nvme_admin": true, 00:30:29.335 "nvme_io": true, 00:30:29.335 "nvme_io_md": false, 00:30:29.335 "write_zeroes": true, 00:30:29.335 "zcopy": false, 00:30:29.335 "get_zone_info": false, 00:30:29.335 "zone_management": false, 00:30:29.335 "zone_append": false, 00:30:29.335 "compare": 
false, 00:30:29.335 "compare_and_write": false, 00:30:29.335 "abort": true, 00:30:29.335 "seek_hole": false, 00:30:29.335 "seek_data": false, 00:30:29.335 "copy": false, 00:30:29.335 "nvme_iov_md": false 00:30:29.335 }, 00:30:29.335 "driver_specific": { 00:30:29.335 "nvme": [ 00:30:29.335 { 00:30:29.335 "pci_address": "0000:5e:00.0", 00:30:29.335 "trid": { 00:30:29.335 "trtype": "PCIe", 00:30:29.335 "traddr": "0000:5e:00.0" 00:30:29.335 }, 00:30:29.335 "ctrlr_data": { 00:30:29.335 "cntlid": 0, 00:30:29.335 "vendor_id": "0x8086", 00:30:29.335 "model_number": "INTEL SSDPF2KX076TZO", 00:30:29.335 "serial_number": "PHAC0301002G7P6CGN", 00:30:29.335 "firmware_revision": "JCV10200", 00:30:29.335 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:29.335 "oacs": { 00:30:29.335 "security": 1, 00:30:29.335 "format": 1, 00:30:29.335 "firmware": 1, 00:30:29.335 "ns_manage": 1 00:30:29.335 }, 00:30:29.335 "multi_ctrlr": false, 00:30:29.335 "ana_reporting": false 00:30:29.335 }, 00:30:29.335 "vs": { 00:30:29.335 "nvme_version": "1.3" 00:30:29.335 }, 00:30:29.335 "ns_data": { 00:30:29.335 "id": 1, 00:30:29.335 "can_share": false 00:30:29.335 }, 00:30:29.335 "security": { 00:30:29.335 "opal": true 00:30:29.335 } 00:30:29.335 } 00:30:29.335 ], 00:30:29.335 "mp_policy": "active_passive" 00:30:29.335 } 00:30:29.335 } 00:30:29.335 ] 00:30:29.335 09:34:38 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:29.335 09:34:38 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:31.945 c3322585-2e49-4067-b863-19a83c8fd082 00:30:31.945 09:34:40 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:31.945 369de96b-1c68-42c8-b4b6-2b4124ebadfa 00:30:31.945 09:34:40 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:31.945 09:34:40 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:31.945 09:34:40 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:31.945 09:34:40 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:31.945 09:34:40 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:31.945 09:34:40 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:31.945 09:34:40 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:32.204 09:34:41 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:32.463 [ 00:30:32.463 { 00:30:32.463 "name": "369de96b-1c68-42c8-b4b6-2b4124ebadfa", 00:30:32.463 "aliases": [ 00:30:32.463 "lvs0/lv0" 00:30:32.463 ], 00:30:32.463 "product_name": "Logical Volume", 00:30:32.463 "block_size": 512, 00:30:32.463 "num_blocks": 204800, 00:30:32.463 "uuid": "369de96b-1c68-42c8-b4b6-2b4124ebadfa", 00:30:32.463 "assigned_rate_limits": { 00:30:32.463 "rw_ios_per_sec": 0, 00:30:32.463 "rw_mbytes_per_sec": 0, 00:30:32.463 "r_mbytes_per_sec": 0, 00:30:32.463 "w_mbytes_per_sec": 0 00:30:32.463 }, 00:30:32.463 "claimed": false, 00:30:32.463 "zoned": false, 00:30:32.463 "supported_io_types": { 00:30:32.463 "read": true, 00:30:32.463 "write": true, 00:30:32.463 "unmap": true, 00:30:32.463 "flush": false, 00:30:32.463 "reset": true, 00:30:32.463 "nvme_admin": 
false, 00:30:32.463 "nvme_io": false, 00:30:32.463 "nvme_io_md": false, 00:30:32.463 "write_zeroes": true, 00:30:32.463 "zcopy": false, 00:30:32.463 "get_zone_info": false, 00:30:32.463 "zone_management": false, 00:30:32.463 "zone_append": false, 00:30:32.463 "compare": false, 00:30:32.463 "compare_and_write": false, 00:30:32.463 "abort": false, 00:30:32.463 "seek_hole": true, 00:30:32.463 "seek_data": true, 00:30:32.463 "copy": false, 00:30:32.463 "nvme_iov_md": false 00:30:32.463 }, 00:30:32.463 "driver_specific": { 00:30:32.463 "lvol": { 00:30:32.463 "lvol_store_uuid": "c3322585-2e49-4067-b863-19a83c8fd082", 00:30:32.463 "base_bdev": "Nvme0n1", 00:30:32.463 "thin_provision": true, 00:30:32.463 "num_allocated_clusters": 0, 00:30:32.463 "snapshot": false, 00:30:32.463 "clone": false, 00:30:32.463 "esnap_clone": false 00:30:32.463 } 00:30:32.463 } 00:30:32.463 } 00:30:32.463 ] 00:30:32.463 09:34:41 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:32.463 09:34:41 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:32.463 09:34:41 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:33.032 [2024-07-15 09:34:41.819326] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:33.032 COMP_lvs0/lv0 00:30:33.032 09:34:41 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:33.032 09:34:41 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:33.032 09:34:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:33.032 09:34:41 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:33.032 09:34:41 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:33.032 09:34:41 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:33.032 09:34:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:33.291 09:34:42 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:33.551 [ 00:30:33.551 { 00:30:33.551 "name": "COMP_lvs0/lv0", 00:30:33.551 "aliases": [ 00:30:33.551 "c4f4c859-8d8b-5ccf-a19f-d992bd4dc2d3" 00:30:33.551 ], 00:30:33.551 "product_name": "compress", 00:30:33.551 "block_size": 512, 00:30:33.551 "num_blocks": 200704, 00:30:33.551 "uuid": "c4f4c859-8d8b-5ccf-a19f-d992bd4dc2d3", 00:30:33.551 "assigned_rate_limits": { 00:30:33.551 "rw_ios_per_sec": 0, 00:30:33.551 "rw_mbytes_per_sec": 0, 00:30:33.551 "r_mbytes_per_sec": 0, 00:30:33.551 "w_mbytes_per_sec": 0 00:30:33.551 }, 00:30:33.551 "claimed": false, 00:30:33.551 "zoned": false, 00:30:33.551 "supported_io_types": { 00:30:33.551 "read": true, 00:30:33.551 "write": true, 00:30:33.551 "unmap": false, 00:30:33.551 "flush": false, 00:30:33.551 "reset": false, 00:30:33.551 "nvme_admin": false, 00:30:33.551 "nvme_io": false, 00:30:33.551 "nvme_io_md": false, 00:30:33.551 "write_zeroes": true, 00:30:33.551 "zcopy": false, 00:30:33.551 "get_zone_info": false, 00:30:33.551 "zone_management": false, 00:30:33.551 "zone_append": false, 00:30:33.551 "compare": false, 00:30:33.551 "compare_and_write": false, 00:30:33.551 "abort": false, 00:30:33.551 "seek_hole": false, 00:30:33.551 "seek_data": false, 00:30:33.551 "copy": false, 00:30:33.551 
"nvme_iov_md": false 00:30:33.551 }, 00:30:33.551 "driver_specific": { 00:30:33.551 "compress": { 00:30:33.551 "name": "COMP_lvs0/lv0", 00:30:33.551 "base_bdev_name": "369de96b-1c68-42c8-b4b6-2b4124ebadfa" 00:30:33.551 } 00:30:33.551 } 00:30:33.551 } 00:30:33.551 ] 00:30:33.551 09:34:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:33.551 09:34:42 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:33.551 Running I/O for 3 seconds... 00:30:36.833 00:30:36.833 Latency(us) 00:30:36.833 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:36.833 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:36.833 Verification LBA range: start 0x0 length 0x3100 00:30:36.833 COMP_lvs0/lv0 : 3.00 3958.70 15.46 0.00 0.00 8029.61 730.16 7693.36 00:30:36.833 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:36.833 Verification LBA range: start 0x3100 length 0x3100 00:30:36.833 COMP_lvs0/lv0 : 3.00 3963.62 15.48 0.00 0.00 8033.20 605.50 7807.33 00:30:36.833 =================================================================================================================== 00:30:36.833 Total : 7922.31 30.95 0.00 0.00 8031.41 605.50 7807.33 00:30:36.833 0 00:30:36.833 09:34:45 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:36.833 09:34:45 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:36.833 09:34:45 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:37.091 09:34:45 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:37.091 09:34:45 compress_isal -- compress/compress.sh@78 -- # killprocess 258490 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 258490 ']' 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@952 -- # kill -0 258490 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 258490 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 258490' 00:30:37.091 killing process with pid 258490 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@967 -- # kill 258490 00:30:37.091 Received shutdown signal, test time was about 3.000000 seconds 00:30:37.091 00:30:37.091 Latency(us) 00:30:37.091 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:37.091 =================================================================================================================== 00:30:37.091 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:37.091 09:34:45 compress_isal -- common/autotest_common.sh@972 -- # wait 258490 00:30:40.377 09:34:48 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:40.378 09:34:48 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:40.378 09:34:48 compress_isal 
-- compress/compress.sh@71 -- # bdevperf_pid=260092 00:30:40.378 09:34:48 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:40.378 09:34:48 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:40.378 09:34:48 compress_isal -- compress/compress.sh@73 -- # waitforlisten 260092 00:30:40.378 09:34:48 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 260092 ']' 00:30:40.378 09:34:48 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:40.378 09:34:48 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:40.378 09:34:48 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:40.378 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:40.378 09:34:48 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:40.378 09:34:48 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:40.378 [2024-07-15 09:34:48.907121] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:30:40.378 [2024-07-15 09:34:48.907192] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid260092 ] 00:30:40.378 [2024-07-15 09:34:49.025839] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:40.378 [2024-07-15 09:34:49.127983] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:40.378 [2024-07-15 09:34:49.127989] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:40.953 09:34:49 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:40.953 09:34:49 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:40.953 09:34:49 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:30:40.953 09:34:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:40.953 09:34:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:41.520 09:34:50 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:41.520 09:34:50 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:41.520 09:34:50 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:41.520 09:34:50 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:41.520 09:34:50 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:41.520 09:34:50 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:41.520 09:34:50 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:41.779 09:34:50 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:42.037 [ 00:30:42.037 { 00:30:42.037 "name": "Nvme0n1", 00:30:42.037 "aliases": [ 00:30:42.037 "01000000-0000-0000-5cd2-e43197705251" 00:30:42.037 ], 00:30:42.037 "product_name": "NVMe disk", 
00:30:42.037 "block_size": 512, 00:30:42.037 "num_blocks": 15002931888, 00:30:42.038 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:42.038 "assigned_rate_limits": { 00:30:42.038 "rw_ios_per_sec": 0, 00:30:42.038 "rw_mbytes_per_sec": 0, 00:30:42.038 "r_mbytes_per_sec": 0, 00:30:42.038 "w_mbytes_per_sec": 0 00:30:42.038 }, 00:30:42.038 "claimed": false, 00:30:42.038 "zoned": false, 00:30:42.038 "supported_io_types": { 00:30:42.038 "read": true, 00:30:42.038 "write": true, 00:30:42.038 "unmap": true, 00:30:42.038 "flush": true, 00:30:42.038 "reset": true, 00:30:42.038 "nvme_admin": true, 00:30:42.038 "nvme_io": true, 00:30:42.038 "nvme_io_md": false, 00:30:42.038 "write_zeroes": true, 00:30:42.038 "zcopy": false, 00:30:42.038 "get_zone_info": false, 00:30:42.038 "zone_management": false, 00:30:42.038 "zone_append": false, 00:30:42.038 "compare": false, 00:30:42.038 "compare_and_write": false, 00:30:42.038 "abort": true, 00:30:42.038 "seek_hole": false, 00:30:42.038 "seek_data": false, 00:30:42.038 "copy": false, 00:30:42.038 "nvme_iov_md": false 00:30:42.038 }, 00:30:42.038 "driver_specific": { 00:30:42.038 "nvme": [ 00:30:42.038 { 00:30:42.038 "pci_address": "0000:5e:00.0", 00:30:42.038 "trid": { 00:30:42.038 "trtype": "PCIe", 00:30:42.038 "traddr": "0000:5e:00.0" 00:30:42.038 }, 00:30:42.038 "ctrlr_data": { 00:30:42.038 "cntlid": 0, 00:30:42.038 "vendor_id": "0x8086", 00:30:42.038 "model_number": "INTEL SSDPF2KX076TZO", 00:30:42.038 "serial_number": "PHAC0301002G7P6CGN", 00:30:42.038 "firmware_revision": "JCV10200", 00:30:42.038 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:42.038 "oacs": { 00:30:42.038 "security": 1, 00:30:42.038 "format": 1, 00:30:42.038 "firmware": 1, 00:30:42.038 "ns_manage": 1 00:30:42.038 }, 00:30:42.038 "multi_ctrlr": false, 00:30:42.038 "ana_reporting": false 00:30:42.038 }, 00:30:42.038 "vs": { 00:30:42.038 "nvme_version": "1.3" 00:30:42.038 }, 00:30:42.038 "ns_data": { 00:30:42.038 "id": 1, 00:30:42.038 "can_share": false 00:30:42.038 }, 00:30:42.038 "security": { 00:30:42.038 "opal": true 00:30:42.038 } 00:30:42.038 } 00:30:42.038 ], 00:30:42.038 "mp_policy": "active_passive" 00:30:42.038 } 00:30:42.038 } 00:30:42.038 ] 00:30:42.038 09:34:50 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:42.038 09:34:50 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:44.572 6a866576-c1ae-4be2-a900-176066fde0c4 00:30:44.572 09:34:53 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:44.830 c8fecb93-d318-4eb0-9f10-819df539c319 00:30:44.830 09:34:53 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:44.830 09:34:53 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:44.830 09:34:53 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:44.830 09:34:53 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:44.830 09:34:53 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:44.830 09:34:53 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:44.830 09:34:53 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:45.089 09:34:53 compress_isal -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:45.089 [ 00:30:45.089 { 00:30:45.089 "name": "c8fecb93-d318-4eb0-9f10-819df539c319", 00:30:45.089 "aliases": [ 00:30:45.089 "lvs0/lv0" 00:30:45.089 ], 00:30:45.089 "product_name": "Logical Volume", 00:30:45.089 "block_size": 512, 00:30:45.089 "num_blocks": 204800, 00:30:45.089 "uuid": "c8fecb93-d318-4eb0-9f10-819df539c319", 00:30:45.089 "assigned_rate_limits": { 00:30:45.089 "rw_ios_per_sec": 0, 00:30:45.089 "rw_mbytes_per_sec": 0, 00:30:45.089 "r_mbytes_per_sec": 0, 00:30:45.089 "w_mbytes_per_sec": 0 00:30:45.089 }, 00:30:45.089 "claimed": false, 00:30:45.089 "zoned": false, 00:30:45.089 "supported_io_types": { 00:30:45.089 "read": true, 00:30:45.089 "write": true, 00:30:45.089 "unmap": true, 00:30:45.089 "flush": false, 00:30:45.089 "reset": true, 00:30:45.089 "nvme_admin": false, 00:30:45.089 "nvme_io": false, 00:30:45.089 "nvme_io_md": false, 00:30:45.089 "write_zeroes": true, 00:30:45.089 "zcopy": false, 00:30:45.089 "get_zone_info": false, 00:30:45.089 "zone_management": false, 00:30:45.089 "zone_append": false, 00:30:45.089 "compare": false, 00:30:45.089 "compare_and_write": false, 00:30:45.089 "abort": false, 00:30:45.089 "seek_hole": true, 00:30:45.089 "seek_data": true, 00:30:45.089 "copy": false, 00:30:45.089 "nvme_iov_md": false 00:30:45.089 }, 00:30:45.089 "driver_specific": { 00:30:45.089 "lvol": { 00:30:45.089 "lvol_store_uuid": "6a866576-c1ae-4be2-a900-176066fde0c4", 00:30:45.089 "base_bdev": "Nvme0n1", 00:30:45.089 "thin_provision": true, 00:30:45.089 "num_allocated_clusters": 0, 00:30:45.089 "snapshot": false, 00:30:45.089 "clone": false, 00:30:45.089 "esnap_clone": false 00:30:45.089 } 00:30:45.089 } 00:30:45.089 } 00:30:45.089 ] 00:30:45.089 09:34:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:45.089 09:34:54 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:45.089 09:34:54 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:45.349 [2024-07-15 09:34:54.266907] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:45.349 COMP_lvs0/lv0 00:30:45.349 09:34:54 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:45.349 09:34:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:45.349 09:34:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:45.349 09:34:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:45.349 09:34:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:45.349 09:34:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:45.349 09:34:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:45.608 09:34:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:45.868 [ 00:30:45.868 { 00:30:45.868 "name": "COMP_lvs0/lv0", 00:30:45.868 "aliases": [ 00:30:45.868 "abe00edd-94dd-5966-bb98-ccf1524991e6" 00:30:45.868 ], 00:30:45.868 "product_name": "compress", 00:30:45.868 "block_size": 512, 00:30:45.868 "num_blocks": 200704, 00:30:45.868 "uuid": "abe00edd-94dd-5966-bb98-ccf1524991e6", 00:30:45.868 
"assigned_rate_limits": { 00:30:45.868 "rw_ios_per_sec": 0, 00:30:45.868 "rw_mbytes_per_sec": 0, 00:30:45.868 "r_mbytes_per_sec": 0, 00:30:45.868 "w_mbytes_per_sec": 0 00:30:45.868 }, 00:30:45.868 "claimed": false, 00:30:45.868 "zoned": false, 00:30:45.868 "supported_io_types": { 00:30:45.868 "read": true, 00:30:45.868 "write": true, 00:30:45.868 "unmap": false, 00:30:45.868 "flush": false, 00:30:45.868 "reset": false, 00:30:45.868 "nvme_admin": false, 00:30:45.868 "nvme_io": false, 00:30:45.868 "nvme_io_md": false, 00:30:45.868 "write_zeroes": true, 00:30:45.868 "zcopy": false, 00:30:45.868 "get_zone_info": false, 00:30:45.868 "zone_management": false, 00:30:45.868 "zone_append": false, 00:30:45.868 "compare": false, 00:30:45.868 "compare_and_write": false, 00:30:45.868 "abort": false, 00:30:45.868 "seek_hole": false, 00:30:45.868 "seek_data": false, 00:30:45.868 "copy": false, 00:30:45.868 "nvme_iov_md": false 00:30:45.868 }, 00:30:45.868 "driver_specific": { 00:30:45.868 "compress": { 00:30:45.868 "name": "COMP_lvs0/lv0", 00:30:45.868 "base_bdev_name": "c8fecb93-d318-4eb0-9f10-819df539c319" 00:30:45.868 } 00:30:45.868 } 00:30:45.868 } 00:30:45.868 ] 00:30:45.868 09:34:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:45.868 09:34:54 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:46.127 Running I/O for 3 seconds... 00:30:49.419 00:30:49.419 Latency(us) 00:30:49.419 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:49.419 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:49.419 Verification LBA range: start 0x0 length 0x3100 00:30:49.419 COMP_lvs0/lv0 : 3.00 3871.27 15.12 0.00 0.00 8208.80 712.35 8548.17 00:30:49.419 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:49.419 Verification LBA range: start 0x3100 length 0x3100 00:30:49.419 COMP_lvs0/lv0 : 3.00 3876.23 15.14 0.00 0.00 8211.35 509.33 8263.23 00:30:49.419 =================================================================================================================== 00:30:49.419 Total : 7747.50 30.26 0.00 0.00 8210.07 509.33 8548.17 00:30:49.419 0 00:30:49.419 09:34:57 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:49.419 09:34:57 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:49.419 09:34:58 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:49.678 09:34:58 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:49.678 09:34:58 compress_isal -- compress/compress.sh@78 -- # killprocess 260092 00:30:49.678 09:34:58 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 260092 ']' 00:30:49.678 09:34:58 compress_isal -- common/autotest_common.sh@952 -- # kill -0 260092 00:30:49.678 09:34:58 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:49.678 09:34:58 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:49.678 09:34:58 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 260092 00:30:49.678 09:34:58 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:49.678 09:34:58 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:49.678 09:34:58 compress_isal 
-- common/autotest_common.sh@966 -- # echo 'killing process with pid 260092' 00:30:49.678 killing process with pid 260092 00:30:49.678 09:34:58 compress_isal -- common/autotest_common.sh@967 -- # kill 260092 00:30:49.678 Received shutdown signal, test time was about 3.000000 seconds 00:30:49.678 00:30:49.678 Latency(us) 00:30:49.678 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:49.678 =================================================================================================================== 00:30:49.678 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:49.678 09:34:58 compress_isal -- common/autotest_common.sh@972 -- # wait 260092 00:30:53.015 09:35:01 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:53.015 09:35:01 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:53.015 09:35:01 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=261691 00:30:53.015 09:35:01 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:53.015 09:35:01 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:53.015 09:35:01 compress_isal -- compress/compress.sh@73 -- # waitforlisten 261691 00:30:53.015 09:35:01 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 261691 ']' 00:30:53.015 09:35:01 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:53.015 09:35:01 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:53.015 09:35:01 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:53.015 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:53.015 09:35:01 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:53.015 09:35:01 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:53.015 [2024-07-15 09:35:01.487160] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
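At this point the harness relaunches bdevperf for the 4 KiB-block variant of the compress_isal run (run_bdevperf 32 4096 3 4096): queue depth 32, 4096-byte I/Os, a 3-second verify workload on reactor cores 1 and 2. A minimal sketch of the equivalent manual sequence, built only from the flags and paths visible in this log ($SPDK is an assumed shorthand for the workspace checkout):

    # Assumed shorthand for the workspace checkout used throughout this log.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

    # Start bdevperf idle (-z: wait for the perform_tests RPC before running),
    # queue depth 32, 4096-byte I/Os, 3-second verify workload, core mask 0x6
    # (cores 1 and 2); -C is passed through exactly as the harness does.
    $SPDK/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
    bdevperf_pid=$!

    # ...build the lvs0/lv0 -> COMP_lvs0/lv0 stack over RPC (see the create/destroy
    # sketch later in this log), then trigger the timed run:
    $SPDK/examples/bdev/bdevperf/bdevperf.py perform_tests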
00:30:53.015 [2024-07-15 09:35:01.487244] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid261691 ] 00:30:53.015 [2024-07-15 09:35:01.605651] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:53.015 [2024-07-15 09:35:01.702818] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:53.015 [2024-07-15 09:35:01.702830] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:53.582 09:35:02 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:53.582 09:35:02 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:53.582 09:35:02 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:30:53.582 09:35:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:53.582 09:35:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:54.151 09:35:03 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:54.151 09:35:03 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:54.151 09:35:03 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:54.151 09:35:03 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:54.151 09:35:03 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:54.151 09:35:03 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:54.151 09:35:03 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:54.409 09:35:03 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:54.667 [ 00:30:54.667 { 00:30:54.667 "name": "Nvme0n1", 00:30:54.667 "aliases": [ 00:30:54.667 "01000000-0000-0000-5cd2-e43197705251" 00:30:54.667 ], 00:30:54.667 "product_name": "NVMe disk", 00:30:54.667 "block_size": 512, 00:30:54.667 "num_blocks": 15002931888, 00:30:54.667 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:54.667 "assigned_rate_limits": { 00:30:54.667 "rw_ios_per_sec": 0, 00:30:54.667 "rw_mbytes_per_sec": 0, 00:30:54.667 "r_mbytes_per_sec": 0, 00:30:54.667 "w_mbytes_per_sec": 0 00:30:54.667 }, 00:30:54.667 "claimed": false, 00:30:54.667 "zoned": false, 00:30:54.667 "supported_io_types": { 00:30:54.667 "read": true, 00:30:54.667 "write": true, 00:30:54.667 "unmap": true, 00:30:54.667 "flush": true, 00:30:54.667 "reset": true, 00:30:54.667 "nvme_admin": true, 00:30:54.667 "nvme_io": true, 00:30:54.667 "nvme_io_md": false, 00:30:54.667 "write_zeroes": true, 00:30:54.667 "zcopy": false, 00:30:54.667 "get_zone_info": false, 00:30:54.667 "zone_management": false, 00:30:54.667 "zone_append": false, 00:30:54.667 "compare": false, 00:30:54.667 "compare_and_write": false, 00:30:54.667 "abort": true, 00:30:54.667 "seek_hole": false, 00:30:54.667 "seek_data": false, 00:30:54.667 "copy": false, 00:30:54.667 "nvme_iov_md": false 00:30:54.667 }, 00:30:54.667 "driver_specific": { 00:30:54.667 "nvme": [ 00:30:54.667 { 00:30:54.667 "pci_address": "0000:5e:00.0", 00:30:54.667 "trid": { 00:30:54.667 "trtype": "PCIe", 00:30:54.667 "traddr": "0000:5e:00.0" 00:30:54.667 }, 00:30:54.667 
"ctrlr_data": { 00:30:54.667 "cntlid": 0, 00:30:54.667 "vendor_id": "0x8086", 00:30:54.667 "model_number": "INTEL SSDPF2KX076TZO", 00:30:54.667 "serial_number": "PHAC0301002G7P6CGN", 00:30:54.667 "firmware_revision": "JCV10200", 00:30:54.667 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:54.667 "oacs": { 00:30:54.667 "security": 1, 00:30:54.667 "format": 1, 00:30:54.667 "firmware": 1, 00:30:54.667 "ns_manage": 1 00:30:54.667 }, 00:30:54.667 "multi_ctrlr": false, 00:30:54.667 "ana_reporting": false 00:30:54.667 }, 00:30:54.667 "vs": { 00:30:54.667 "nvme_version": "1.3" 00:30:54.667 }, 00:30:54.667 "ns_data": { 00:30:54.667 "id": 1, 00:30:54.667 "can_share": false 00:30:54.667 }, 00:30:54.667 "security": { 00:30:54.667 "opal": true 00:30:54.667 } 00:30:54.667 } 00:30:54.667 ], 00:30:54.667 "mp_policy": "active_passive" 00:30:54.667 } 00:30:54.667 } 00:30:54.667 ] 00:30:54.667 09:35:03 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:54.667 09:35:03 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:57.202 ae752f1d-ff16-458f-a8a1-d049d1ac205c 00:30:57.202 09:35:05 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:57.460 59b662cf-cddd-414b-8395-11a2e8106bef 00:30:57.460 09:35:06 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:57.460 09:35:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:57.460 09:35:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:57.460 09:35:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:57.460 09:35:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:57.460 09:35:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:57.460 09:35:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:57.460 09:35:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:57.719 [ 00:30:57.719 { 00:30:57.719 "name": "59b662cf-cddd-414b-8395-11a2e8106bef", 00:30:57.719 "aliases": [ 00:30:57.719 "lvs0/lv0" 00:30:57.719 ], 00:30:57.719 "product_name": "Logical Volume", 00:30:57.719 "block_size": 512, 00:30:57.719 "num_blocks": 204800, 00:30:57.719 "uuid": "59b662cf-cddd-414b-8395-11a2e8106bef", 00:30:57.719 "assigned_rate_limits": { 00:30:57.719 "rw_ios_per_sec": 0, 00:30:57.719 "rw_mbytes_per_sec": 0, 00:30:57.719 "r_mbytes_per_sec": 0, 00:30:57.719 "w_mbytes_per_sec": 0 00:30:57.719 }, 00:30:57.719 "claimed": false, 00:30:57.719 "zoned": false, 00:30:57.719 "supported_io_types": { 00:30:57.719 "read": true, 00:30:57.719 "write": true, 00:30:57.719 "unmap": true, 00:30:57.719 "flush": false, 00:30:57.719 "reset": true, 00:30:57.719 "nvme_admin": false, 00:30:57.719 "nvme_io": false, 00:30:57.719 "nvme_io_md": false, 00:30:57.719 "write_zeroes": true, 00:30:57.719 "zcopy": false, 00:30:57.719 "get_zone_info": false, 00:30:57.719 "zone_management": false, 00:30:57.719 "zone_append": false, 00:30:57.719 "compare": false, 00:30:57.719 "compare_and_write": false, 00:30:57.719 "abort": false, 00:30:57.719 "seek_hole": true, 00:30:57.719 "seek_data": true, 00:30:57.719 "copy": false, 00:30:57.719 
"nvme_iov_md": false 00:30:57.719 }, 00:30:57.719 "driver_specific": { 00:30:57.719 "lvol": { 00:30:57.719 "lvol_store_uuid": "ae752f1d-ff16-458f-a8a1-d049d1ac205c", 00:30:57.719 "base_bdev": "Nvme0n1", 00:30:57.719 "thin_provision": true, 00:30:57.719 "num_allocated_clusters": 0, 00:30:57.719 "snapshot": false, 00:30:57.719 "clone": false, 00:30:57.719 "esnap_clone": false 00:30:57.719 } 00:30:57.719 } 00:30:57.719 } 00:30:57.719 ] 00:30:57.719 09:35:06 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:57.719 09:35:06 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:57.719 09:35:06 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:57.978 [2024-07-15 09:35:06.883069] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:57.978 COMP_lvs0/lv0 00:30:57.978 09:35:06 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:57.978 09:35:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:57.978 09:35:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:57.978 09:35:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:57.978 09:35:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:57.978 09:35:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:57.978 09:35:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:58.237 09:35:07 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:58.496 [ 00:30:58.496 { 00:30:58.496 "name": "COMP_lvs0/lv0", 00:30:58.496 "aliases": [ 00:30:58.496 "4d2c7ac4-31ff-53e3-8710-58381973c4d2" 00:30:58.496 ], 00:30:58.496 "product_name": "compress", 00:30:58.496 "block_size": 4096, 00:30:58.496 "num_blocks": 25088, 00:30:58.496 "uuid": "4d2c7ac4-31ff-53e3-8710-58381973c4d2", 00:30:58.496 "assigned_rate_limits": { 00:30:58.496 "rw_ios_per_sec": 0, 00:30:58.496 "rw_mbytes_per_sec": 0, 00:30:58.496 "r_mbytes_per_sec": 0, 00:30:58.496 "w_mbytes_per_sec": 0 00:30:58.496 }, 00:30:58.496 "claimed": false, 00:30:58.496 "zoned": false, 00:30:58.496 "supported_io_types": { 00:30:58.496 "read": true, 00:30:58.496 "write": true, 00:30:58.496 "unmap": false, 00:30:58.496 "flush": false, 00:30:58.496 "reset": false, 00:30:58.496 "nvme_admin": false, 00:30:58.496 "nvme_io": false, 00:30:58.496 "nvme_io_md": false, 00:30:58.496 "write_zeroes": true, 00:30:58.496 "zcopy": false, 00:30:58.496 "get_zone_info": false, 00:30:58.496 "zone_management": false, 00:30:58.496 "zone_append": false, 00:30:58.496 "compare": false, 00:30:58.496 "compare_and_write": false, 00:30:58.496 "abort": false, 00:30:58.496 "seek_hole": false, 00:30:58.496 "seek_data": false, 00:30:58.496 "copy": false, 00:30:58.496 "nvme_iov_md": false 00:30:58.496 }, 00:30:58.496 "driver_specific": { 00:30:58.496 "compress": { 00:30:58.496 "name": "COMP_lvs0/lv0", 00:30:58.496 "base_bdev_name": "59b662cf-cddd-414b-8395-11a2e8106bef" 00:30:58.496 } 00:30:58.496 } 00:30:58.496 } 00:30:58.496 ] 00:30:58.496 09:35:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:58.496 09:35:07 compress_isal -- compress/compress.sh@75 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:58.754 Running I/O for 3 seconds... 00:31:02.038 00:31:02.038 Latency(us) 00:31:02.038 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:02.038 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:02.038 Verification LBA range: start 0x0 length 0x3100 00:31:02.038 COMP_lvs0/lv0 : 3.00 3912.89 15.28 0.00 0.00 8124.37 712.35 8206.25 00:31:02.038 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:02.039 Verification LBA range: start 0x3100 length 0x3100 00:31:02.039 COMP_lvs0/lv0 : 3.00 3917.89 15.30 0.00 0.00 8126.67 495.08 8092.27 00:31:02.039 =================================================================================================================== 00:31:02.039 Total : 7830.78 30.59 0.00 0.00 8125.52 495.08 8206.25 00:31:02.039 0 00:31:02.039 09:35:10 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:02.039 09:35:10 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:02.039 09:35:10 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:02.297 09:35:11 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:02.297 09:35:11 compress_isal -- compress/compress.sh@78 -- # killprocess 261691 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 261691 ']' 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@952 -- # kill -0 261691 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 261691 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 261691' 00:31:02.297 killing process with pid 261691 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@967 -- # kill 261691 00:31:02.297 Received shutdown signal, test time was about 3.000000 seconds 00:31:02.297 00:31:02.297 Latency(us) 00:31:02.297 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:02.297 =================================================================================================================== 00:31:02.297 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:02.297 09:35:11 compress_isal -- common/autotest_common.sh@972 -- # wait 261691 00:31:05.586 09:35:14 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:31:05.586 09:35:14 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:05.586 09:35:14 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=263293 00:31:05.586 09:35:14 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:05.586 09:35:14 compress_isal -- compress/compress.sh@57 -- # waitforlisten 263293 00:31:05.586 09:35:14 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 263293 ']' 00:31:05.586 09:35:14 compress_isal -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:05.586 09:35:14 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:05.586 09:35:14 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:05.586 09:35:14 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:31:05.586 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:05.586 09:35:14 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:05.586 09:35:14 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:05.586 [2024-07-15 09:35:14.136091] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:31:05.586 [2024-07-15 09:35:14.136162] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid263293 ] 00:31:05.586 [2024-07-15 09:35:14.263995] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:05.586 [2024-07-15 09:35:14.372666] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:05.586 [2024-07-15 09:35:14.372751] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:05.586 [2024-07-15 09:35:14.372755] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:06.153 09:35:15 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:06.153 09:35:15 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:06.153 09:35:15 compress_isal -- compress/compress.sh@58 -- # create_vols 00:31:06.153 09:35:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:06.153 09:35:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:06.721 09:35:15 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:06.721 09:35:15 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:06.721 09:35:15 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:06.721 09:35:15 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:06.721 09:35:15 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:06.721 09:35:15 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:06.721 09:35:15 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:06.981 09:35:15 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:07.239 [ 00:31:07.239 { 00:31:07.239 "name": "Nvme0n1", 00:31:07.239 "aliases": [ 00:31:07.239 "01000000-0000-0000-5cd2-e43197705251" 00:31:07.239 ], 00:31:07.239 "product_name": "NVMe disk", 00:31:07.239 "block_size": 512, 00:31:07.239 "num_blocks": 15002931888, 00:31:07.239 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:07.239 "assigned_rate_limits": { 00:31:07.239 "rw_ios_per_sec": 0, 00:31:07.239 "rw_mbytes_per_sec": 0, 00:31:07.239 "r_mbytes_per_sec": 0, 00:31:07.239 "w_mbytes_per_sec": 0 00:31:07.239 }, 00:31:07.239 "claimed": false, 
00:31:07.239 "zoned": false, 00:31:07.239 "supported_io_types": { 00:31:07.239 "read": true, 00:31:07.239 "write": true, 00:31:07.239 "unmap": true, 00:31:07.239 "flush": true, 00:31:07.239 "reset": true, 00:31:07.239 "nvme_admin": true, 00:31:07.239 "nvme_io": true, 00:31:07.239 "nvme_io_md": false, 00:31:07.239 "write_zeroes": true, 00:31:07.239 "zcopy": false, 00:31:07.239 "get_zone_info": false, 00:31:07.239 "zone_management": false, 00:31:07.239 "zone_append": false, 00:31:07.239 "compare": false, 00:31:07.239 "compare_and_write": false, 00:31:07.239 "abort": true, 00:31:07.239 "seek_hole": false, 00:31:07.239 "seek_data": false, 00:31:07.239 "copy": false, 00:31:07.239 "nvme_iov_md": false 00:31:07.239 }, 00:31:07.239 "driver_specific": { 00:31:07.239 "nvme": [ 00:31:07.239 { 00:31:07.239 "pci_address": "0000:5e:00.0", 00:31:07.239 "trid": { 00:31:07.239 "trtype": "PCIe", 00:31:07.239 "traddr": "0000:5e:00.0" 00:31:07.239 }, 00:31:07.239 "ctrlr_data": { 00:31:07.239 "cntlid": 0, 00:31:07.239 "vendor_id": "0x8086", 00:31:07.239 "model_number": "INTEL SSDPF2KX076TZO", 00:31:07.239 "serial_number": "PHAC0301002G7P6CGN", 00:31:07.240 "firmware_revision": "JCV10200", 00:31:07.240 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:07.240 "oacs": { 00:31:07.240 "security": 1, 00:31:07.240 "format": 1, 00:31:07.240 "firmware": 1, 00:31:07.240 "ns_manage": 1 00:31:07.240 }, 00:31:07.240 "multi_ctrlr": false, 00:31:07.240 "ana_reporting": false 00:31:07.240 }, 00:31:07.240 "vs": { 00:31:07.240 "nvme_version": "1.3" 00:31:07.240 }, 00:31:07.240 "ns_data": { 00:31:07.240 "id": 1, 00:31:07.240 "can_share": false 00:31:07.240 }, 00:31:07.240 "security": { 00:31:07.240 "opal": true 00:31:07.240 } 00:31:07.240 } 00:31:07.240 ], 00:31:07.240 "mp_policy": "active_passive" 00:31:07.240 } 00:31:07.240 } 00:31:07.240 ] 00:31:07.240 09:35:16 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:07.240 09:35:16 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:09.778 b70fd4e4-a0f0-4ebe-8ff2-25228c5ebbfb 00:31:09.778 09:35:18 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:10.040 e54ac111-fe67-4770-a1a7-81d2c16d1ece 00:31:10.040 09:35:18 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:10.040 09:35:18 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:10.040 09:35:18 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:10.040 09:35:18 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:10.040 09:35:18 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:10.040 09:35:18 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:10.040 09:35:18 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:10.299 09:35:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:10.299 [ 00:31:10.299 { 00:31:10.299 "name": "e54ac111-fe67-4770-a1a7-81d2c16d1ece", 00:31:10.299 "aliases": [ 00:31:10.299 "lvs0/lv0" 00:31:10.299 ], 00:31:10.299 "product_name": "Logical Volume", 00:31:10.299 "block_size": 512, 00:31:10.299 "num_blocks": 204800, 00:31:10.299 "uuid": 
"e54ac111-fe67-4770-a1a7-81d2c16d1ece", 00:31:10.299 "assigned_rate_limits": { 00:31:10.299 "rw_ios_per_sec": 0, 00:31:10.299 "rw_mbytes_per_sec": 0, 00:31:10.299 "r_mbytes_per_sec": 0, 00:31:10.299 "w_mbytes_per_sec": 0 00:31:10.299 }, 00:31:10.299 "claimed": false, 00:31:10.299 "zoned": false, 00:31:10.299 "supported_io_types": { 00:31:10.299 "read": true, 00:31:10.299 "write": true, 00:31:10.299 "unmap": true, 00:31:10.299 "flush": false, 00:31:10.299 "reset": true, 00:31:10.299 "nvme_admin": false, 00:31:10.299 "nvme_io": false, 00:31:10.299 "nvme_io_md": false, 00:31:10.299 "write_zeroes": true, 00:31:10.299 "zcopy": false, 00:31:10.299 "get_zone_info": false, 00:31:10.299 "zone_management": false, 00:31:10.299 "zone_append": false, 00:31:10.299 "compare": false, 00:31:10.299 "compare_and_write": false, 00:31:10.299 "abort": false, 00:31:10.299 "seek_hole": true, 00:31:10.299 "seek_data": true, 00:31:10.299 "copy": false, 00:31:10.299 "nvme_iov_md": false 00:31:10.299 }, 00:31:10.299 "driver_specific": { 00:31:10.299 "lvol": { 00:31:10.299 "lvol_store_uuid": "b70fd4e4-a0f0-4ebe-8ff2-25228c5ebbfb", 00:31:10.299 "base_bdev": "Nvme0n1", 00:31:10.299 "thin_provision": true, 00:31:10.299 "num_allocated_clusters": 0, 00:31:10.299 "snapshot": false, 00:31:10.299 "clone": false, 00:31:10.299 "esnap_clone": false 00:31:10.299 } 00:31:10.299 } 00:31:10.299 } 00:31:10.299 ] 00:31:10.299 09:35:19 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:10.299 09:35:19 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:10.299 09:35:19 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:10.558 [2024-07-15 09:35:19.444073] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:10.558 COMP_lvs0/lv0 00:31:10.558 09:35:19 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:10.558 09:35:19 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:10.558 09:35:19 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:10.558 09:35:19 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:10.558 09:35:19 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:10.558 09:35:19 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:10.558 09:35:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:10.818 09:35:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:11.129 [ 00:31:11.129 { 00:31:11.129 "name": "COMP_lvs0/lv0", 00:31:11.129 "aliases": [ 00:31:11.129 "98fd895a-3eb6-5661-98a7-9a076b067260" 00:31:11.129 ], 00:31:11.129 "product_name": "compress", 00:31:11.129 "block_size": 512, 00:31:11.129 "num_blocks": 200704, 00:31:11.129 "uuid": "98fd895a-3eb6-5661-98a7-9a076b067260", 00:31:11.129 "assigned_rate_limits": { 00:31:11.129 "rw_ios_per_sec": 0, 00:31:11.129 "rw_mbytes_per_sec": 0, 00:31:11.129 "r_mbytes_per_sec": 0, 00:31:11.129 "w_mbytes_per_sec": 0 00:31:11.129 }, 00:31:11.129 "claimed": false, 00:31:11.129 "zoned": false, 00:31:11.129 "supported_io_types": { 00:31:11.129 "read": true, 00:31:11.129 "write": true, 00:31:11.129 "unmap": false, 00:31:11.129 "flush": false, 00:31:11.129 
"reset": false, 00:31:11.129 "nvme_admin": false, 00:31:11.129 "nvme_io": false, 00:31:11.129 "nvme_io_md": false, 00:31:11.129 "write_zeroes": true, 00:31:11.129 "zcopy": false, 00:31:11.129 "get_zone_info": false, 00:31:11.129 "zone_management": false, 00:31:11.129 "zone_append": false, 00:31:11.129 "compare": false, 00:31:11.129 "compare_and_write": false, 00:31:11.129 "abort": false, 00:31:11.129 "seek_hole": false, 00:31:11.129 "seek_data": false, 00:31:11.129 "copy": false, 00:31:11.129 "nvme_iov_md": false 00:31:11.129 }, 00:31:11.129 "driver_specific": { 00:31:11.129 "compress": { 00:31:11.129 "name": "COMP_lvs0/lv0", 00:31:11.129 "base_bdev_name": "e54ac111-fe67-4770-a1a7-81d2c16d1ece" 00:31:11.129 } 00:31:11.129 } 00:31:11.129 } 00:31:11.129 ] 00:31:11.129 09:35:19 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:11.129 09:35:19 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:11.405 I/O targets: 00:31:11.405 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:11.405 00:31:11.405 00:31:11.405 CUnit - A unit testing framework for C - Version 2.1-3 00:31:11.405 http://cunit.sourceforge.net/ 00:31:11.405 00:31:11.405 00:31:11.405 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:11.405 Test: blockdev write read block ...passed 00:31:11.405 Test: blockdev write zeroes read block ...passed 00:31:11.405 Test: blockdev write zeroes read no split ...passed 00:31:11.405 Test: blockdev write zeroes read split ...passed 00:31:11.405 Test: blockdev write zeroes read split partial ...passed 00:31:11.405 Test: blockdev reset ...[2024-07-15 09:35:20.101476] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:11.405 passed 00:31:11.405 Test: blockdev write read 8 blocks ...passed 00:31:11.405 Test: blockdev write read size > 128k ...passed 00:31:11.405 Test: blockdev write read invalid size ...passed 00:31:11.405 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:11.405 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:11.405 Test: blockdev write read max offset ...passed 00:31:11.405 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:11.405 Test: blockdev writev readv 8 blocks ...passed 00:31:11.405 Test: blockdev writev readv 30 x 1block ...passed 00:31:11.405 Test: blockdev writev readv block ...passed 00:31:11.405 Test: blockdev writev readv size > 128k ...passed 00:31:11.405 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:11.405 Test: blockdev comparev and writev ...passed 00:31:11.405 Test: blockdev nvme passthru rw ...passed 00:31:11.405 Test: blockdev nvme passthru vendor specific ...passed 00:31:11.405 Test: blockdev nvme admin passthru ...passed 00:31:11.405 Test: blockdev copy ...passed 00:31:11.405 00:31:11.405 Run Summary: Type Total Ran Passed Failed Inactive 00:31:11.405 suites 1 1 n/a 0 0 00:31:11.405 tests 23 23 23 0 0 00:31:11.405 asserts 130 130 130 0 n/a 00:31:11.405 00:31:11.405 Elapsed time = 0.112 seconds 00:31:11.405 0 00:31:11.405 09:35:20 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:31:11.405 09:35:20 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:11.664 09:35:20 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:31:11.924 09:35:20 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:11.924 09:35:20 compress_isal -- compress/compress.sh@62 -- # killprocess 263293 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 263293 ']' 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@952 -- # kill -0 263293 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 263293 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 263293' 00:31:11.924 killing process with pid 263293 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@967 -- # kill 263293 00:31:11.924 09:35:20 compress_isal -- common/autotest_common.sh@972 -- # wait 263293 00:31:15.232 09:35:23 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:15.232 09:35:23 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:15.232 00:31:15.232 real 0m47.179s 00:31:15.232 user 1m50.729s 00:31:15.232 sys 0m4.127s 00:31:15.232 09:35:23 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:15.232 09:35:23 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:15.232 ************************************ 00:31:15.232 END TEST compress_isal 00:31:15.232 ************************************ 00:31:15.232 09:35:23 -- common/autotest_common.sh@1142 -- # return 0 00:31:15.232 09:35:23 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:31:15.232 09:35:23 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:31:15.232 09:35:23 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:15.232 09:35:23 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:15.232 09:35:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:15.232 09:35:23 -- common/autotest_common.sh@10 -- # set +x 00:31:15.232 ************************************ 00:31:15.232 START TEST blockdev_crypto_aesni 00:31:15.232 ************************************ 00:31:15.232 09:35:23 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:15.232 * Looking for test storage... 
00:31:15.232 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=264592 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 264592 00:31:15.232 09:35:23 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 264592 ']' 00:31:15.232 09:35:23 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:15.232 09:35:23 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:15.232 09:35:23 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:15.232 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
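The blockdev_crypto_aesni suite starts its own SPDK target idle so the AES-NI crypto configuration can be applied over RPC before the framework initializes; the launch traced on the next log line reduces to the sketch below (workspace path assumed, pid handling simplified):

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    # --wait-for-rpc keeps the target idle until the accel/crypto setup RPCs have been sent.
    $SPDK/build/bin/spdk_tgt --wait-for-rpc &
    spdk_tgt_pid=$!
    # waitforlisten then blocks until the target listens on /var/tmp/spdk.sock.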
00:31:15.232 09:35:23 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:15.232 09:35:23 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:15.232 09:35:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:15.232 [2024-07-15 09:35:23.943212] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:31:15.232 [2024-07-15 09:35:23.943295] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid264592 ] 00:31:15.232 [2024-07-15 09:35:24.074064] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:15.232 [2024-07-15 09:35:24.179485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:16.168 09:35:24 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:16.168 09:35:24 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:31:16.168 09:35:24 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:31:16.168 09:35:24 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:31:16.168 09:35:24 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:31:16.168 09:35:24 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:16.168 09:35:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:16.168 [2024-07-15 09:35:24.857652] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:16.168 [2024-07-15 09:35:24.865687] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:16.168 [2024-07-15 09:35:24.873705] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:16.168 [2024-07-15 09:35:24.947601] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:18.703 true 00:31:18.703 true 00:31:18.703 true 00:31:18.703 true 00:31:18.703 Malloc0 00:31:18.703 Malloc1 00:31:18.703 Malloc2 00:31:18.703 Malloc3 00:31:18.703 [2024-07-15 09:35:27.331442] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:18.703 crypto_ram 00:31:18.703 [2024-07-15 09:35:27.339457] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:18.703 crypto_ram2 00:31:18.703 [2024-07-15 09:35:27.347479] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:18.703 crypto_ram3 00:31:18.703 [2024-07-15 09:35:27.355504] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:18.703 crypto_ram4 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:18.703 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:18.703 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:31:18.703 09:35:27 
blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:18.703 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:18.703 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:18.703 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "9ad43bf6-6552-58b0-af97-3f22e5a48346"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9ad43bf6-6552-58b0-af97-3f22e5a48346",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "de7e663c-1e56-5715-b6c4-d8420e5c86d0"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "de7e663c-1e56-5715-b6c4-d8420e5c86d0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "cccae811-5fac-577b-b846-25e035202f58"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "cccae811-5fac-577b-b846-25e035202f58",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "3b93c472-4643-5a01-8400-4b0d2352a72c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3b93c472-4643-5a01-8400-4b0d2352a72c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 
00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:31:18.704 09:35:27 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 264592 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 264592 ']' 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 264592 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 264592 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 264592' 00:31:18.704 killing process with pid 264592 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 264592 00:31:18.704 09:35:27 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 264592 00:31:19.641 09:35:28 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:19.641 09:35:28 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:19.641 09:35:28 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:31:19.641 09:35:28 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:19.641 09:35:28 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:19.641 ************************************ 00:31:19.641 START TEST bdev_hello_world 00:31:19.641 ************************************ 00:31:19.641 09:35:28 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:19.641 [2024-07-15 09:35:28.332372] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:31:19.641 [2024-07-15 09:35:28.332440] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid265186 ] 00:31:19.641 [2024-07-15 09:35:28.464846] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:19.641 [2024-07-15 09:35:28.576026] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:19.900 [2024-07-15 09:35:28.597322] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:19.900 [2024-07-15 09:35:28.605349] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:19.900 [2024-07-15 09:35:28.613373] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:19.900 [2024-07-15 09:35:28.718895] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:22.435 [2024-07-15 09:35:30.956681] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:22.435 [2024-07-15 09:35:30.956761] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:22.435 [2024-07-15 09:35:30.956778] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:22.435 [2024-07-15 09:35:30.964699] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:22.435 [2024-07-15 09:35:30.964719] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:22.435 [2024-07-15 09:35:30.964731] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:22.435 [2024-07-15 09:35:30.972721] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:22.435 [2024-07-15 09:35:30.972743] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:22.435 [2024-07-15 09:35:30.972755] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:22.435 [2024-07-15 09:35:30.980739] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:22.435 [2024-07-15 09:35:30.980759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:22.435 [2024-07-15 09:35:30.980771] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:22.435 [2024-07-15 09:35:31.058711] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:31:22.435 [2024-07-15 09:35:31.058760] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:31:22.435 [2024-07-15 09:35:31.058780] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:31:22.435 [2024-07-15 09:35:31.060057] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:31:22.435 [2024-07-15 09:35:31.060131] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:31:22.435 [2024-07-15 09:35:31.060148] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:31:22.435 [2024-07-15 09:35:31.060193] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
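The bdev_hello_world pass above is simply the hello_bdev example run against the crypto_ram vbdev described in test/bdev/bdev.json (the trailing '' in the logged command is the harness's empty $env_ctx placeholder). A standalone re-run looks like:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    # Writes "Hello World!" through the AES-NI crypto vbdev and reads it back;
    # expect the "bdev io write completed successfully" and
    # "Read string from bdev : Hello World!" notices seen above.
    $SPDK/build/examples/hello_bdev --json $SPDK/test/bdev/bdev.json -b crypto_ram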
00:31:22.435 00:31:22.435 [2024-07-15 09:35:31.060212] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:31:22.694 00:31:22.694 real 0m3.220s 00:31:22.694 user 0m2.782s 00:31:22.694 sys 0m0.401s 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:31:22.694 ************************************ 00:31:22.694 END TEST bdev_hello_world 00:31:22.694 ************************************ 00:31:22.694 09:35:31 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:22.694 09:35:31 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:31:22.694 09:35:31 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:22.694 09:35:31 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:22.694 09:35:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:22.694 ************************************ 00:31:22.694 START TEST bdev_bounds 00:31:22.694 ************************************ 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=265671 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 265671' 00:31:22.694 Process bdevio pid: 265671 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 265671 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 265671 ']' 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:22.694 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:22.694 09:35:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:22.694 [2024-07-15 09:35:31.638303] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
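bdev_bounds drives the same four crypto vbdevs through the bdevio I/O-boundary suite: bdevio is started in wait mode against bdev.json and then told to run via its tests.py helper, exactly as the two commands in the surrounding log lines show. Condensed:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    # -w: wait for the perform_tests RPC; -s 0 mirrors PRE_RESERVED_MEM=0 from blockdev.sh.
    $SPDK/test/bdev/bdevio/bdevio -w -s 0 --json $SPDK/test/bdev/bdev.json &
    bdevio_pid=$!
    $SPDK/test/bdev/bdevio/tests.py perform_tests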
00:31:22.694 [2024-07-15 09:35:31.638374] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid265671 ] 00:31:22.953 [2024-07-15 09:35:31.765991] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:22.953 [2024-07-15 09:35:31.871830] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:22.953 [2024-07-15 09:35:31.871914] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:22.953 [2024-07-15 09:35:31.871918] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:22.953 [2024-07-15 09:35:31.893267] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:22.953 [2024-07-15 09:35:31.901285] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:23.213 [2024-07-15 09:35:31.909305] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:23.213 [2024-07-15 09:35:32.021976] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:25.748 [2024-07-15 09:35:34.237144] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:25.748 [2024-07-15 09:35:34.237226] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:25.748 [2024-07-15 09:35:34.237241] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:25.748 [2024-07-15 09:35:34.245159] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:25.748 [2024-07-15 09:35:34.245178] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:25.748 [2024-07-15 09:35:34.245190] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:25.748 [2024-07-15 09:35:34.253180] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:25.748 [2024-07-15 09:35:34.253201] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:25.748 [2024-07-15 09:35:34.253212] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:25.748 [2024-07-15 09:35:34.261201] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:25.748 [2024-07-15 09:35:34.261230] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:25.748 [2024-07-15 09:35:34.261241] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:25.748 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:25.748 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:31:25.748 09:35:34 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:25.748 I/O targets: 00:31:25.748 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:31:25.748 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:31:25.748 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:31:25.748 crypto_ram4: 8192 blocks of 4096 bytes 
(32 MiB) 00:31:25.748 00:31:25.748 00:31:25.748 CUnit - A unit testing framework for C - Version 2.1-3 00:31:25.748 http://cunit.sourceforge.net/ 00:31:25.748 00:31:25.748 00:31:25.748 Suite: bdevio tests on: crypto_ram4 00:31:25.748 Test: blockdev write read block ...passed 00:31:25.748 Test: blockdev write zeroes read block ...passed 00:31:25.748 Test: blockdev write zeroes read no split ...passed 00:31:25.748 Test: blockdev write zeroes read split ...passed 00:31:25.748 Test: blockdev write zeroes read split partial ...passed 00:31:25.748 Test: blockdev reset ...passed 00:31:25.748 Test: blockdev write read 8 blocks ...passed 00:31:25.748 Test: blockdev write read size > 128k ...passed 00:31:25.748 Test: blockdev write read invalid size ...passed 00:31:25.748 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:25.748 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:25.748 Test: blockdev write read max offset ...passed 00:31:25.748 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:25.748 Test: blockdev writev readv 8 blocks ...passed 00:31:25.748 Test: blockdev writev readv 30 x 1block ...passed 00:31:25.748 Test: blockdev writev readv block ...passed 00:31:25.748 Test: blockdev writev readv size > 128k ...passed 00:31:25.748 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:25.748 Test: blockdev comparev and writev ...passed 00:31:25.748 Test: blockdev nvme passthru rw ...passed 00:31:25.748 Test: blockdev nvme passthru vendor specific ...passed 00:31:25.748 Test: blockdev nvme admin passthru ...passed 00:31:25.748 Test: blockdev copy ...passed 00:31:25.748 Suite: bdevio tests on: crypto_ram3 00:31:25.748 Test: blockdev write read block ...passed 00:31:25.748 Test: blockdev write zeroes read block ...passed 00:31:25.748 Test: blockdev write zeroes read no split ...passed 00:31:25.748 Test: blockdev write zeroes read split ...passed 00:31:25.748 Test: blockdev write zeroes read split partial ...passed 00:31:25.748 Test: blockdev reset ...passed 00:31:25.748 Test: blockdev write read 8 blocks ...passed 00:31:25.748 Test: blockdev write read size > 128k ...passed 00:31:25.748 Test: blockdev write read invalid size ...passed 00:31:25.748 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:25.748 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:25.748 Test: blockdev write read max offset ...passed 00:31:25.748 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:25.748 Test: blockdev writev readv 8 blocks ...passed 00:31:25.748 Test: blockdev writev readv 30 x 1block ...passed 00:31:25.748 Test: blockdev writev readv block ...passed 00:31:25.748 Test: blockdev writev readv size > 128k ...passed 00:31:25.748 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:25.748 Test: blockdev comparev and writev ...passed 00:31:25.748 Test: blockdev nvme passthru rw ...passed 00:31:25.748 Test: blockdev nvme passthru vendor specific ...passed 00:31:25.748 Test: blockdev nvme admin passthru ...passed 00:31:25.748 Test: blockdev copy ...passed 00:31:25.748 Suite: bdevio tests on: crypto_ram2 00:31:25.748 Test: blockdev write read block ...passed 00:31:25.748 Test: blockdev write zeroes read block ...passed 00:31:25.748 Test: blockdev write zeroes read no split ...passed 00:31:25.748 Test: blockdev write zeroes read split ...passed 00:31:25.748 Test: blockdev write zeroes read split partial ...passed 
00:31:25.748 Test: blockdev reset ...passed 00:31:25.748 Test: blockdev write read 8 blocks ...passed 00:31:25.748 Test: blockdev write read size > 128k ...passed 00:31:25.748 Test: blockdev write read invalid size ...passed 00:31:25.748 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:25.748 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:25.748 Test: blockdev write read max offset ...passed 00:31:25.748 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:25.748 Test: blockdev writev readv 8 blocks ...passed 00:31:25.748 Test: blockdev writev readv 30 x 1block ...passed 00:31:25.748 Test: blockdev writev readv block ...passed 00:31:25.748 Test: blockdev writev readv size > 128k ...passed 00:31:25.748 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:25.748 Test: blockdev comparev and writev ...passed 00:31:25.748 Test: blockdev nvme passthru rw ...passed 00:31:25.748 Test: blockdev nvme passthru vendor specific ...passed 00:31:25.748 Test: blockdev nvme admin passthru ...passed 00:31:25.749 Test: blockdev copy ...passed 00:31:25.749 Suite: bdevio tests on: crypto_ram 00:31:25.749 Test: blockdev write read block ...passed 00:31:25.749 Test: blockdev write zeroes read block ...passed 00:31:25.749 Test: blockdev write zeroes read no split ...passed 00:31:25.749 Test: blockdev write zeroes read split ...passed 00:31:25.749 Test: blockdev write zeroes read split partial ...passed 00:31:25.749 Test: blockdev reset ...passed 00:31:25.749 Test: blockdev write read 8 blocks ...passed 00:31:25.749 Test: blockdev write read size > 128k ...passed 00:31:25.749 Test: blockdev write read invalid size ...passed 00:31:25.749 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:25.749 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:25.749 Test: blockdev write read max offset ...passed 00:31:25.749 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:25.749 Test: blockdev writev readv 8 blocks ...passed 00:31:25.749 Test: blockdev writev readv 30 x 1block ...passed 00:31:25.749 Test: blockdev writev readv block ...passed 00:31:25.749 Test: blockdev writev readv size > 128k ...passed 00:31:25.749 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:25.749 Test: blockdev comparev and writev ...passed 00:31:25.749 Test: blockdev nvme passthru rw ...passed 00:31:25.749 Test: blockdev nvme passthru vendor specific ...passed 00:31:25.749 Test: blockdev nvme admin passthru ...passed 00:31:25.749 Test: blockdev copy ...passed 00:31:25.749 00:31:25.749 Run Summary: Type Total Ran Passed Failed Inactive 00:31:25.749 suites 4 4 n/a 0 0 00:31:25.749 tests 92 92 92 0 0 00:31:25.749 asserts 520 520 520 0 n/a 00:31:25.749 00:31:25.749 Elapsed time = 0.530 seconds 00:31:25.749 0 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 265671 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 265671 ']' 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 265671 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
265671 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 265671' 00:31:26.007 killing process with pid 265671 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 265671 00:31:26.007 09:35:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 265671 00:31:26.266 09:35:35 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:31:26.266 00:31:26.266 real 0m3.597s 00:31:26.266 user 0m9.924s 00:31:26.266 sys 0m0.550s 00:31:26.266 09:35:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:26.266 09:35:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:26.266 ************************************ 00:31:26.266 END TEST bdev_bounds 00:31:26.266 ************************************ 00:31:26.266 09:35:35 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:26.266 09:35:35 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:31:26.266 09:35:35 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:31:26.266 09:35:35 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:26.266 09:35:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:26.524 ************************************ 00:31:26.524 START TEST bdev_nbd 00:31:26.524 ************************************ 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # 
bdev_num=4 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=266100 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 266100 /var/tmp/spdk-nbd.sock 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 266100 ']' 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:31:26.524 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:26.524 09:35:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:26.524 [2024-07-15 09:35:35.319667] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
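
Note on the harness driving this part of the log: nbd_function_test launches the standalone bdev_svc application with the crypto bdev configuration from bdev.json and an RPC socket at /var/tmp/spdk-nbd.sock, then performs every NBD operation through rpc.py against that socket. The following is a minimal sketch of the round-trip it exercises, using only RPC calls and bdev names that appear in this trace; paths are shortened relative to the SPDK checkout, and this is an illustration rather than the verbatim nbd_common.sh helpers.

  # start the bdev service with the crypto bdevs defined in test/bdev/bdev.json
  ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json test/bdev/bdev.json &

  # export one crypto bdev over NBD, list the active exports, then tear it down
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0

The start/stop verify, the dd-based data verify, and the lvol/mkfs check that follow in the log are built from these RPCs plus ordinary dd and cmp against the resulting /dev/nbd* nodes.
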
00:31:26.524 [2024-07-15 09:35:35.319734] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:26.524 [2024-07-15 09:35:35.451464] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:26.783 [2024-07-15 09:35:35.558098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:26.784 [2024-07-15 09:35:35.579398] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:26.784 [2024-07-15 09:35:35.587419] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:26.784 [2024-07-15 09:35:35.595437] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:26.784 [2024-07-15 09:35:35.705977] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:29.322 [2024-07-15 09:35:37.932400] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:29.322 [2024-07-15 09:35:37.932476] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:29.322 [2024-07-15 09:35:37.932491] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:29.322 [2024-07-15 09:35:37.940417] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:29.322 [2024-07-15 09:35:37.940437] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:29.322 [2024-07-15 09:35:37.940449] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:29.322 [2024-07-15 09:35:37.948436] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:29.322 [2024-07-15 09:35:37.948454] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:29.322 [2024-07-15 09:35:37.948466] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:29.322 [2024-07-15 09:35:37.956458] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:29.322 [2024-07-15 09:35:37.956478] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:29.322 [2024-07-15 09:35:37.956489] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:29.322 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:29.892 1+0 records in 00:31:29.892 1+0 records out 00:31:29.892 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276272 s, 14.8 MB/s 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:29.892 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:29.893 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:31:30.152 
09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:30.152 1+0 records in 00:31:30.152 1+0 records out 00:31:30.152 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000321802 s, 12.7 MB/s 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:30.152 09:35:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 
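
The xtrace entries on either side of this point are the waitfornbd helper from autotest_common.sh confirming that each newly exported device is actually usable. Reconstructed from the trace, the check reduces to two bounded polls: wait for the nbd name to appear in /proc/partitions, then read a single 4 KiB block back with direct I/O and confirm the result is non-empty. A condensed sketch follows; the temp-file path and retry delay are assumptions, not the verbatim helper.

  waitfornbd() {
      local nbd_name=$1 i
      # wait for the kernel to publish the device node
      for ((i = 1; i <= 20; i++)); do
          grep -q -w "$nbd_name" /proc/partitions && break
          sleep 0.1
      done
      # read one block with O_DIRECT; an empty result means the export is not ready yet
      for ((i = 1; i <= 20; i++)); do
          dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
          [ "$(stat -c %s /tmp/nbdtest)" != 0 ] && break
          sleep 0.1
      done
      rm -f /tmp/nbdtest
      return 0
  }
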
00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:30.152 1+0 records in 00:31:30.152 1+0 records out 00:31:30.152 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307581 s, 13.3 MB/s 00:31:30.152 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:30.441 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:30.701 1+0 records in 00:31:30.701 1+0 records out 00:31:30.701 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0003531 s, 11.6 MB/s 00:31:30.701 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:30.701 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:30.701 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:30.701 09:35:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:30.701 09:35:39 blockdev_crypto_aesni.bdev_nbd 
-- common/autotest_common.sh@887 -- # return 0 00:31:30.701 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:30.701 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:30.701 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:31:30.997 { 00:31:30.997 "nbd_device": "/dev/nbd0", 00:31:30.997 "bdev_name": "crypto_ram" 00:31:30.997 }, 00:31:30.997 { 00:31:30.997 "nbd_device": "/dev/nbd1", 00:31:30.997 "bdev_name": "crypto_ram2" 00:31:30.997 }, 00:31:30.997 { 00:31:30.997 "nbd_device": "/dev/nbd2", 00:31:30.997 "bdev_name": "crypto_ram3" 00:31:30.997 }, 00:31:30.997 { 00:31:30.997 "nbd_device": "/dev/nbd3", 00:31:30.997 "bdev_name": "crypto_ram4" 00:31:30.997 } 00:31:30.997 ]' 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:31:30.997 { 00:31:30.997 "nbd_device": "/dev/nbd0", 00:31:30.997 "bdev_name": "crypto_ram" 00:31:30.997 }, 00:31:30.997 { 00:31:30.997 "nbd_device": "/dev/nbd1", 00:31:30.997 "bdev_name": "crypto_ram2" 00:31:30.997 }, 00:31:30.997 { 00:31:30.997 "nbd_device": "/dev/nbd2", 00:31:30.997 "bdev_name": "crypto_ram3" 00:31:30.997 }, 00:31:30.997 { 00:31:30.997 "nbd_device": "/dev/nbd3", 00:31:30.997 "bdev_name": "crypto_ram4" 00:31:30.997 } 00:31:30.997 ]' 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:30.997 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:31.256 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:31.256 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:31.256 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:31.256 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:31.256 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:31.256 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:31.256 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:31.256 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:31.256 09:35:39 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:31.256 09:35:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:31.516 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:31.775 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:32.035 09:35:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:32.035 09:35:40 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:32.295 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:31:32.555 /dev/nbd0 00:31:32.555 09:35:41 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:31:32.555 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:31:32.555 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:32.555 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:32.556 1+0 records in 00:31:32.556 1+0 records out 00:31:32.556 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269553 s, 15.2 MB/s 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:32.556 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:31:32.818 /dev/nbd1 00:31:32.818 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:31:32.818 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:31:32.818 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:32.818 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:32.818 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:32.818 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:32.818 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:32.818 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:32.818 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:32.819 1+0 records in 00:31:32.819 1+0 records out 00:31:32.819 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317178 s, 12.9 MB/s 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:32.819 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:31:33.078 /dev/nbd10 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:33.078 1+0 records in 00:31:33.078 1+0 records out 00:31:33.078 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000322579 s, 12.7 MB/s 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:33.078 09:35:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:33.078 09:35:41 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:31:33.338 /dev/nbd11 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:33.338 1+0 records in 00:31:33.338 1+0 records out 00:31:33.338 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261225 s, 15.7 MB/s 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:33.338 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:31:33.597 { 00:31:33.597 "nbd_device": "/dev/nbd0", 00:31:33.597 "bdev_name": "crypto_ram" 00:31:33.597 }, 00:31:33.597 { 00:31:33.597 "nbd_device": "/dev/nbd1", 00:31:33.597 "bdev_name": "crypto_ram2" 00:31:33.597 }, 00:31:33.597 { 00:31:33.597 "nbd_device": "/dev/nbd10", 00:31:33.597 "bdev_name": "crypto_ram3" 00:31:33.597 }, 00:31:33.597 { 00:31:33.597 "nbd_device": "/dev/nbd11", 00:31:33.597 "bdev_name": "crypto_ram4" 00:31:33.597 } 00:31:33.597 ]' 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:31:33.597 { 00:31:33.597 
"nbd_device": "/dev/nbd0", 00:31:33.597 "bdev_name": "crypto_ram" 00:31:33.597 }, 00:31:33.597 { 00:31:33.597 "nbd_device": "/dev/nbd1", 00:31:33.597 "bdev_name": "crypto_ram2" 00:31:33.597 }, 00:31:33.597 { 00:31:33.597 "nbd_device": "/dev/nbd10", 00:31:33.597 "bdev_name": "crypto_ram3" 00:31:33.597 }, 00:31:33.597 { 00:31:33.597 "nbd_device": "/dev/nbd11", 00:31:33.597 "bdev_name": "crypto_ram4" 00:31:33.597 } 00:31:33.597 ]' 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:31:33.597 /dev/nbd1 00:31:33.597 /dev/nbd10 00:31:33.597 /dev/nbd11' 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:31:33.597 /dev/nbd1 00:31:33.597 /dev/nbd10 00:31:33.597 /dev/nbd11' 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:31:33.597 256+0 records in 00:31:33.597 256+0 records out 00:31:33.597 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109543 s, 95.7 MB/s 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:33.597 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:31:33.857 256+0 records in 00:31:33.857 256+0 records out 00:31:33.857 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0609706 s, 17.2 MB/s 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:31:33.857 256+0 records in 00:31:33.857 256+0 records out 00:31:33.857 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0591049 s, 17.7 MB/s 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- 
# dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:31:33.857 256+0 records in 00:31:33.857 256+0 records out 00:31:33.857 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0608445 s, 17.2 MB/s 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:31:33.857 256+0 records in 00:31:33.857 256+0 records out 00:31:33.857 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0581623 s, 18.0 MB/s 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:33.857 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:34.116 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:34.116 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:34.116 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:34.116 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:34.116 09:35:42 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:34.116 09:35:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:34.376 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:34.635 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:34.893 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:35.152 09:35:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:35.152 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:35.152 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:35.152 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:31:35.411 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:31:35.670 malloc_lvol_verify 00:31:35.670 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:31:35.670 
76a73573-8ac5-48f0-8df5-e6705cd30549 00:31:35.929 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:31:35.929 078ddd05-fd3e-4899-b77e-6b12a899fdda 00:31:36.188 09:35:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:31:36.188 /dev/nbd0 00:31:36.188 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:31:36.188 mke2fs 1.46.5 (30-Dec-2021) 00:31:36.188 Discarding device blocks: 0/4096 done 00:31:36.188 Creating filesystem with 4096 1k blocks and 1024 inodes 00:31:36.188 00:31:36.188 Allocating group tables: 0/1 done 00:31:36.188 Writing inode tables: 0/1 done 00:31:36.449 Creating journal (1024 blocks): done 00:31:36.449 Writing superblocks and filesystem accounting information: 0/1 done 00:31:36.449 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:36.449 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 266100 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 266100 ']' 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 266100 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd 
-- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 266100 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 266100' 00:31:36.708 killing process with pid 266100 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 266100 00:31:36.708 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 266100 00:31:36.968 09:35:45 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:31:36.968 00:31:36.968 real 0m10.636s 00:31:36.968 user 0m13.843s 00:31:36.968 sys 0m4.125s 00:31:36.968 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:36.968 09:35:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:36.968 ************************************ 00:31:36.968 END TEST bdev_nbd 00:31:36.968 ************************************ 00:31:37.229 09:35:45 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:37.229 09:35:45 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:31:37.229 09:35:45 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:31:37.229 09:35:45 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:31:37.229 09:35:45 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:31:37.229 09:35:45 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:37.229 09:35:45 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:37.229 09:35:45 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:37.229 ************************************ 00:31:37.229 START TEST bdev_fio 00:31:37.229 ************************************ 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:37.229 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:31:37.229 09:35:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:37.229 ************************************ 00:31:37.229 START TEST bdev_fio_rw_verify 00:31:37.229 ************************************ 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:37.229 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:37.488 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:37.488 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:37.488 09:35:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:37.748 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:37.748 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:37.748 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:37.748 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:37.748 fio-3.35 00:31:37.748 Starting 4 threads 00:31:52.639 00:31:52.639 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=268194: Mon Jul 15 09:35:59 2024 00:31:52.639 read: IOPS=27.1k, BW=106MiB/s (111MB/s)(1060MiB/10001msec) 00:31:52.639 slat (usec): min=10, max=435, avg=49.90, stdev=41.94 00:31:52.639 clat (usec): min=9, max=2739, avg=263.79, stdev=234.66 00:31:52.639 lat (usec): min=32, max=3094, avg=313.69, stdev=265.60 00:31:52.639 clat percentiles (usec): 00:31:52.639 | 50.000th=[ 198], 99.000th=[ 1254], 99.900th=[ 1647], 99.990th=[ 1811], 00:31:52.639 | 99.999th=[ 2474] 00:31:52.639 write: IOPS=29.7k, BW=116MiB/s (122MB/s)(1136MiB/9777msec); 0 zone resets 00:31:52.639 slat (usec): min=13, max=1309, avg=59.95, stdev=43.38 00:31:52.639 clat (usec): min=22, max=2291, avg=320.68, stdev=277.56 00:31:52.639 lat (usec): min=42, max=2502, avg=380.63, stdev=310.00 00:31:52.639 clat percentiles (usec): 00:31:52.639 | 50.000th=[ 249], 99.000th=[ 1467], 99.900th=[ 2057], 99.990th=[ 2212], 00:31:52.639 | 99.999th=[ 2278] 00:31:52.639 bw ( KiB/s): min=87120, max=159424, per=98.56%, avg=117287.47, stdev=5643.92, samples=76 00:31:52.639 iops : min=21780, max=39856, avg=29321.79, stdev=1410.99, samples=76 00:31:52.639 lat (usec) : 10=0.01%, 20=0.01%, 50=2.52%, 100=12.01%, 250=43.19% 00:31:52.639 lat (usec) : 500=29.34%, 750=6.57%, 1000=3.13% 00:31:52.639 lat (msec) : 2=3.17%, 4=0.08% 00:31:52.639 cpu : usr=99.65%, sys=0.00%, ctx=87, majf=0, minf=265 00:31:52.639 IO depths : 1=10.7%, 2=25.4%, 4=50.9%, 8=13.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:52.639 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:52.639 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:52.639 issued rwts: total=271338,290862,0,0 short=0,0,0,0 
dropped=0,0,0,0 00:31:52.639 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:52.639 00:31:52.639 Run status group 0 (all jobs): 00:31:52.639 READ: bw=106MiB/s (111MB/s), 106MiB/s-106MiB/s (111MB/s-111MB/s), io=1060MiB (1111MB), run=10001-10001msec 00:31:52.639 WRITE: bw=116MiB/s (122MB/s), 116MiB/s-116MiB/s (122MB/s-122MB/s), io=1136MiB (1191MB), run=9777-9777msec 00:31:52.639 00:31:52.639 real 0m13.517s 00:31:52.639 user 0m45.463s 00:31:52.639 sys 0m0.504s 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:31:52.640 ************************************ 00:31:52.640 END TEST bdev_fio_rw_verify 00:31:52.640 ************************************ 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "9ad43bf6-6552-58b0-af97-3f22e5a48346"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9ad43bf6-6552-58b0-af97-3f22e5a48346",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' 
"w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "de7e663c-1e56-5715-b6c4-d8420e5c86d0"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "de7e663c-1e56-5715-b6c4-d8420e5c86d0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "cccae811-5fac-577b-b846-25e035202f58"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "cccae811-5fac-577b-b846-25e035202f58",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' 
"3b93c472-4643-5a01-8400-4b0d2352a72c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3b93c472-4643-5a01-8400-4b0d2352a72c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:31:52.640 crypto_ram2 00:31:52.640 crypto_ram3 00:31:52.640 crypto_ram4 ]] 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "9ad43bf6-6552-58b0-af97-3f22e5a48346"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9ad43bf6-6552-58b0-af97-3f22e5a48346",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "de7e663c-1e56-5715-b6c4-d8420e5c86d0"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "de7e663c-1e56-5715-b6c4-d8420e5c86d0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' 
"get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "cccae811-5fac-577b-b846-25e035202f58"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "cccae811-5fac-577b-b846-25e035202f58",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "3b93c472-4643-5a01-8400-4b0d2352a72c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3b93c472-4643-5a01-8400-4b0d2352a72c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:31:52.640 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:52.641 ************************************ 00:31:52.641 START TEST bdev_fio_trim 00:31:52.641 ************************************ 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:52.641 09:35:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:52.641 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:52.641 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:52.641 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:52.641 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:52.641 fio-3.35 00:31:52.641 Starting 4 threads 00:32:04.906 00:32:04.906 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=270223: Mon Jul 15 09:36:12 2024 00:32:04.906 write: IOPS=38.9k, BW=152MiB/s (159MB/s)(1519MiB/10001msec); 0 zone resets 00:32:04.906 slat (usec): min=16, max=433, avg=57.06, stdev=23.11 00:32:04.906 clat (usec): min=52, max=1834, avg=265.20, stdev=133.62 00:32:04.906 lat (usec): min=68, max=1910, 
avg=322.25, stdev=143.77 00:32:04.906 clat percentiles (usec): 00:32:04.906 | 50.000th=[ 237], 99.000th=[ 652], 99.900th=[ 750], 99.990th=[ 832], 00:32:04.906 | 99.999th=[ 1565] 00:32:04.906 bw ( KiB/s): min=149872, max=207552, per=100.00%, avg=155806.32, stdev=4425.46, samples=76 00:32:04.906 iops : min=37468, max=51888, avg=38951.58, stdev=1106.36, samples=76 00:32:04.906 trim: IOPS=38.9k, BW=152MiB/s (159MB/s)(1519MiB/10001msec); 0 zone resets 00:32:04.906 slat (usec): min=5, max=102, avg=16.39, stdev= 6.59 00:32:04.906 clat (usec): min=45, max=1910, avg=249.43, stdev=107.04 00:32:04.906 lat (usec): min=54, max=1934, avg=265.82, stdev=109.37 00:32:04.906 clat percentiles (usec): 00:32:04.906 | 50.000th=[ 237], 99.000th=[ 529], 99.900th=[ 578], 99.990th=[ 742], 00:32:04.906 | 99.999th=[ 1614] 00:32:04.906 bw ( KiB/s): min=149872, max=207584, per=100.00%, avg=155807.16, stdev=4425.84, samples=76 00:32:04.906 iops : min=37468, max=51896, avg=38951.79, stdev=1106.46, samples=76 00:32:04.906 lat (usec) : 50=0.03%, 100=5.67%, 250=48.94%, 500=40.97%, 750=4.33% 00:32:04.906 lat (usec) : 1000=0.05% 00:32:04.906 lat (msec) : 2=0.01% 00:32:04.906 cpu : usr=99.60%, sys=0.00%, ctx=86, majf=0, minf=95 00:32:04.906 IO depths : 1=6.8%, 2=26.6%, 4=53.2%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:04.906 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:04.906 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:04.906 issued rwts: total=0,388737,388739,0 short=0,0,0,0 dropped=0,0,0,0 00:32:04.906 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:04.906 00:32:04.906 Run status group 0 (all jobs): 00:32:04.906 WRITE: bw=152MiB/s (159MB/s), 152MiB/s-152MiB/s (159MB/s-159MB/s), io=1519MiB (1592MB), run=10001-10001msec 00:32:04.906 TRIM: bw=152MiB/s (159MB/s), 152MiB/s-152MiB/s (159MB/s-159MB/s), io=1519MiB (1592MB), run=10001-10001msec 00:32:04.906 00:32:04.906 real 0m13.556s 00:32:04.906 user 0m45.816s 00:32:04.906 sys 0m0.483s 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:04.906 ************************************ 00:32:04.906 END TEST bdev_fio_trim 00:32:04.906 ************************************ 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:32:04.906 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:32:04.906 00:32:04.906 real 0m27.445s 00:32:04.906 user 1m31.466s 00:32:04.906 sys 0m1.196s 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:04.906 ************************************ 00:32:04.906 END TEST bdev_fio 00:32:04.906 ************************************ 00:32:04.906 09:36:13 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:04.906 09:36:13 blockdev_crypto_aesni -- 
bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:04.906 09:36:13 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:04.906 09:36:13 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:04.906 09:36:13 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:04.906 09:36:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:04.906 ************************************ 00:32:04.906 START TEST bdev_verify 00:32:04.906 ************************************ 00:32:04.906 09:36:13 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:04.906 [2024-07-15 09:36:13.561853] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:32:04.906 [2024-07-15 09:36:13.561907] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid271919 ] 00:32:04.906 [2024-07-15 09:36:13.677559] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:04.906 [2024-07-15 09:36:13.788583] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:04.906 [2024-07-15 09:36:13.788587] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:04.906 [2024-07-15 09:36:13.809973] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:04.906 [2024-07-15 09:36:13.817997] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:04.906 [2024-07-15 09:36:13.826025] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:05.179 [2024-07-15 09:36:13.936534] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:07.715 [2024-07-15 09:36:16.150520] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:07.715 [2024-07-15 09:36:16.150603] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:07.715 [2024-07-15 09:36:16.150619] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:07.715 [2024-07-15 09:36:16.158535] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:07.715 [2024-07-15 09:36:16.158556] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:07.715 [2024-07-15 09:36:16.158568] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:07.715 [2024-07-15 09:36:16.166558] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:07.715 [2024-07-15 09:36:16.166578] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:07.715 [2024-07-15 09:36:16.166590] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base 
bdev arrival 00:32:07.715 [2024-07-15 09:36:16.174581] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:07.715 [2024-07-15 09:36:16.174598] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:07.715 [2024-07-15 09:36:16.174610] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:07.715 Running I/O for 5 seconds... 00:32:12.990 00:32:12.991 Latency(us) 00:32:12.991 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:12.991 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:12.991 Verification LBA range: start 0x0 length 0x1000 00:32:12.991 crypto_ram : 5.07 486.97 1.90 0.00 0.00 261166.00 2763.91 182361.04 00:32:12.991 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:12.991 Verification LBA range: start 0x1000 length 0x1000 00:32:12.991 crypto_ram : 5.08 493.04 1.93 0.00 0.00 257995.68 3675.71 181449.24 00:32:12.991 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:12.991 Verification LBA range: start 0x0 length 0x1000 00:32:12.991 crypto_ram2 : 5.08 491.47 1.92 0.00 0.00 258510.71 2749.66 169595.77 00:32:12.991 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:12.991 Verification LBA range: start 0x1000 length 0x1000 00:32:12.991 crypto_ram2 : 5.08 497.35 1.94 0.00 0.00 255474.82 5841.25 168683.97 00:32:12.991 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:12.991 Verification LBA range: start 0x0 length 0x1000 00:32:12.991 crypto_ram3 : 5.06 3822.75 14.93 0.00 0.00 33145.28 8320.22 30089.57 00:32:12.991 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:12.991 Verification LBA range: start 0x1000 length 0x1000 00:32:12.991 crypto_ram3 : 5.06 3848.68 15.03 0.00 0.00 32912.88 8092.27 30089.57 00:32:12.991 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:12.991 Verification LBA range: start 0x0 length 0x1000 00:32:12.991 crypto_ram4 : 5.07 3839.22 15.00 0.00 0.00 32915.29 2393.49 25872.47 00:32:12.991 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:12.991 Verification LBA range: start 0x1000 length 0x1000 00:32:12.991 crypto_ram4 : 5.07 3863.57 15.09 0.00 0.00 32706.64 3575.99 25872.47 00:32:12.991 =================================================================================================================== 00:32:12.991 Total : 17343.05 67.75 0.00 0.00 58573.74 2393.49 182361.04 00:32:12.991 00:32:12.991 real 0m8.293s 00:32:12.991 user 0m15.734s 00:32:12.991 sys 0m0.371s 00:32:12.991 09:36:21 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:12.991 09:36:21 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:32:12.991 ************************************ 00:32:12.991 END TEST bdev_verify 00:32:12.991 ************************************ 00:32:12.991 09:36:21 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:12.991 09:36:21 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:12.991 09:36:21 blockdev_crypto_aesni -- 
common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:12.991 09:36:21 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:12.991 09:36:21 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:12.991 ************************************ 00:32:12.991 START TEST bdev_verify_big_io 00:32:12.991 ************************************ 00:32:12.991 09:36:21 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:13.250 [2024-07-15 09:36:21.951260] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:32:13.250 [2024-07-15 09:36:21.951328] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid273016 ] 00:32:13.250 [2024-07-15 09:36:22.081197] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:13.250 [2024-07-15 09:36:22.190783] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:13.250 [2024-07-15 09:36:22.190789] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:13.510 [2024-07-15 09:36:22.212150] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:13.510 [2024-07-15 09:36:22.220178] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:13.510 [2024-07-15 09:36:22.228206] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:13.510 [2024-07-15 09:36:22.344256] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:16.050 [2024-07-15 09:36:24.567970] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:16.050 [2024-07-15 09:36:24.568065] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:16.050 [2024-07-15 09:36:24.568081] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:16.050 [2024-07-15 09:36:24.575983] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:16.050 [2024-07-15 09:36:24.576003] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:16.050 [2024-07-15 09:36:24.576015] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:16.050 [2024-07-15 09:36:24.584005] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:16.050 [2024-07-15 09:36:24.584023] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:16.050 [2024-07-15 09:36:24.584034] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:16.050 [2024-07-15 09:36:24.592030] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:16.050 [2024-07-15 09:36:24.592048] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:16.050 [2024-07-15 09:36:24.592060] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev 
creation deferred pending base bdev arrival 00:32:16.050 Running I/O for 5 seconds... 00:32:16.618 [2024-07-15 09:36:25.529238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:16.618 [2024-07-15 09:36:25.529728] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:16.618 [2024-07-15 09:36:25.529909] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:16.618 [2024-07-15 09:36:25.530010] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:16.618 [2024-07-15 09:36:25.530083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:16.618 [2024-07-15 09:36:25.530457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.531538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.531594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.531638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.531679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.532153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.532202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.532243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.532293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.532635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.534417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.618 [2024-07-15 09:36:25.534473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.534526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.534568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.535043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.535090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.535145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.535205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
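The repeated *ERROR*: Failed to get dst_mbufs! / Failed to get src_mbufs! lines above are emitted by the dpdk_cryptodev accel module while the 128-deep, 64 KiB verify workload is in flight. For reference, that workload is a single bdevperf invocation against the crypto bdev config; a minimal standalone sketch, assuming this run's workspace layout (SPDK_DIR and the bdev.json path are copied from the trace, not new parameters):

#!/usr/bin/env bash
# Sketch only: re-issue the bdev_verify_big_io workload traced above by hand.
# Assumption: SPDK_DIR is the checkout used in this run, and test/bdev/bdev.json is the
# same config (crypto_ram .. crypto_ram4) referenced by the earlier fio stages.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk

# Flags copied verbatim from the traced command: queue depth 128, 64 KiB I/Os,
# verify (write + read-back compare) workload, 5 second runtime, core mask 0x3.
"$SPDK_DIR/build/examples/bdevperf" \
  --json "$SPDK_DIR/test/bdev/bdev.json" \
  -q 128 -o 65536 -w verify -t 5 -C -m 0x3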
00:32:16.619 [2024-07-15 09:36:25.535593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.536503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.536565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.536607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.536648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.537171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.537218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.537261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.537320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.537754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.539085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.539147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.539200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.539257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.539741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.539791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.539832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.539873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.540209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.541207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.541271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.541332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.541373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.541830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.619 [2024-07-15 09:36:25.541879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.541919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.541978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.542412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.543503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.543554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.543611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.543652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.544179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.544226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.544267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.544308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.544581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.545721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.545771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.545812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.545852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.546296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.546346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.546387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.546428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.546782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.547843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.547898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.619 [2024-07-15 09:36:25.547958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.548004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.548548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.548599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.548653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.548694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.549045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.550555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.550605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.550645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.550685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.551096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.551158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.551216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.551259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.551582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.552673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.552736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.552777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.552817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.553345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.553392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.553446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.553487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.619 [2024-07-15 09:36:25.553921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.555579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.555641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.555708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.555750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.556257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.556316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.556365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.556420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.556769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.557722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.557778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.557819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.557860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.558366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.558418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.558471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.558514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.558931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.560324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.619 [2024-07-15 09:36:25.560373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.560426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.560468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.560997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.620 [2024-07-15 09:36:25.561045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.561086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.561127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.561435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.562476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.562540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.562592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.562640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.563109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.563158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.563200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.563253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.563704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.564744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.564794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.564835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.564892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.565442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.565494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.565535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.565577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.565845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.567036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.567091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.620 [2024-07-15 09:36:25.567132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.567176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.567649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.567703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.567743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.567785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.568159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.569188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.569240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.569307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.569349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.569868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.569921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.569986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.620 [2024-07-15 09:36:25.570043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.570378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.571563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.571615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.571656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.571699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.572135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.572184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.572225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.572266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.881 [2024-07-15 09:36:25.572606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.573582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.573632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.573673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.573714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.574260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.574311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.574364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.574408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.574784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.576116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.576184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.576226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.576291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.576756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.576822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.576880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.576921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.577303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.578218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.578274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.578316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.578356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.578842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.881 [2024-07-15 09:36:25.578889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.578949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.578994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.579400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.580809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.580871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.580913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.580973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.581447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.581493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.581535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.581575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.581923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.582905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.582974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.583032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.583074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.583538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.583585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.583626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.583679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.584119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.585201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.585257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.881 [2024-07-15 09:36:25.585317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.585358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.585866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.585918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.585968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.586009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.586332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.587425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.587476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.587517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.587557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.588032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.588081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.588122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.588162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.588518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.589577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.589628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.589671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.589711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.590232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.590293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.590335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.590390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.881 [2024-07-15 09:36:25.590806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.591959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.592009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.592051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.592091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.592520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.592566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.592606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.881 [2024-07-15 09:36:25.592646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.592952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.593941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.593999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.594040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.594080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.594485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.594535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.594575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.594616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.594876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.596234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.596285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.596327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.596367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.596763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.882 [2024-07-15 09:36:25.596810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.596851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.596892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.597201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.598173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.598223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.598263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.598304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.598729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.598775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.598816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.598856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.599161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.600391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.600442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.600488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.600899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.600952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.601000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.602217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.603143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.604529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.606018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.606409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.882 [2024-07-15 09:36:25.608166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.608554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.608943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.611502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.612826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.614509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.616029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.617832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.618699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.619089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.619813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.622358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.623295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.624680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.626091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.628166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.628559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.628946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.630519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.632974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.634730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.636318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.637778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.639006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.882 [2024-07-15 09:36:25.639401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.640196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.641576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.643594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.644977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.646379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.647790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.648524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.648918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.650433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.651799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.654539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.656051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.657477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.658880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.659680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.660319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.661697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.663178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.665716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.667131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.668659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.670319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.671166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.882 [2024-07-15 09:36:25.672611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.673993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.675397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.677895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.679300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.680715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.882 [2024-07-15 09:36:25.681517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.682502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.683893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.685414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.687084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.689690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.691258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.692947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.693330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.695168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.696602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.698015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.699428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.701957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.703383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.704011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.704393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.706102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.883 [2024-07-15 09:36:25.707518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.709045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.710729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.713304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.714856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.715247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.715640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.717337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.718757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.720174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.721081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.723619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.724360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.724746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.725588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.727448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.729031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.730744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.732008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.734730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.735128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.735528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.737232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.739029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:16.883 [2024-07-15 09:36:25.740451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.741359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.742837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.744793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.745195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.746010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.747388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.749427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.751218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.752405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.753773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.755286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.755677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.757286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.758710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.760529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.761471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.763002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.764667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.766245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.767115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.768501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.769905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:16.883 [2024-07-15 09:36:25.772008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.151 [2024-07-15 09:36:26.056499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.151 [2024-07-15 09:36:26.058161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.151 [2024-07-15 09:36:26.059681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.151 [2024-07-15 09:36:26.061088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.151 [2024-07-15 09:36:26.061357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.151 [2024-07-15 09:36:26.062076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.151 [2024-07-15 09:36:26.062469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.063255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.064639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.066568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.067978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.069386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.071038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.071311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.071794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.072204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.074013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.075677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.078572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.080359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.082007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.083778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.084152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.084636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.152 [2024-07-15 09:36:26.085776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.087154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.088548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.090979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.092364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.094024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.094883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.152 [2024-07-15 09:36:26.095294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.095903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.097306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.098817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.100481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.103043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.104703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.106110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.106499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.106954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.108725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.110306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.111785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.113570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.116413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.118237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.118626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.413 [2024-07-15 09:36:26.119014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.119283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.120742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.122150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.123797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.124778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.127627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.128441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.128829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.129443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.129713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.131470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.133236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.134888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.136006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.138718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.139122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.139510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.141073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.141387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.142892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.144554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.145243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.146772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.413 [2024-07-15 09:36:26.148503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.148900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.149761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.151153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.151425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.153189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.154542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.155948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.157332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.158814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.159218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.160965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.162497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.162803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.164564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.165246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.166667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.168226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.169786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.170765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.172140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.173533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.173801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.175241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.413 [2024-07-15 09:36:26.176703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.178078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.179474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.181126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.182912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.184612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.186203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.186475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.187250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.188639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.190054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.191697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.193967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.195364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.196759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.198415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.198733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.200533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.202147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.203611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.413 [2024-07-15 09:36:26.205261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.208145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.209777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.211535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.414 [2024-07-15 09:36:26.213191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.213511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.214993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.216426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.218073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.219177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.221998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.223396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.225042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.225779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.226059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.227854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.229466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.231199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.231593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.234180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.235833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.237247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.238615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.238951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.240456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.242118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.242983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.243370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.414 [2024-07-15 09:36:26.245993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.247655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.248325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.249715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.249991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.251783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.253307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.253694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.254081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.256885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.257979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.259673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.261206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.261515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.263239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.263827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.264216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.264997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.267739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.268576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.269947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.271334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.271602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.273210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.414 [2024-07-15 09:36:26.273601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.273991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.275596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.277863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.279536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.281052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.282504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.282776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.283405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.283793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.284675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.286054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.288118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.289517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.290917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.292565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.292839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.293339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.293732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.295132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.296533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.298859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.299259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.299647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.414 [2024-07-15 09:36:26.301251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.301522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.303028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.303669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.305055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.306687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.308351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.308750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.309147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.309536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.309898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.310394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.310792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.311184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.311575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.313281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.313680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.314077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.314474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.314863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.315360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.315756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.316157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.414 [2024-07-15 09:36:26.316543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.414 [2024-07-15 09:36:26.318262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.318660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.319064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.319457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.319905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.320455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.320857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.321257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.321644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.323415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.323809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.324208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.324595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.325022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.325509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.325900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.326298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.326687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.328358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.328755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.329150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.329540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.329917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.330420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.415 [2024-07-15 09:36:26.330807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.331212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.331603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.333536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.333937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.334322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.334369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.334739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.335232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.335623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.336013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.336401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.338103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.338167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.338222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.338266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.338597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.339097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.339152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.339194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.339235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.340675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.340727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.340768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.415 [2024-07-15 09:36:26.340810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.341290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.341435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.341480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.341521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.341562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.342902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.342964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.343005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.343046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.343393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.343542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.343587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.343628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.343670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.345039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.345089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.345144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.345186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.345589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.345742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.345805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.345848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.345903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.415 [2024-07-15 09:36:26.347450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.347500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.347542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.347583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.347887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.348048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.348100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.348142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.348183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.349561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.349613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.349675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.349717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.350150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.350307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.350353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.350407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.350450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.351813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.351865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.351941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.351995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.352343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.352494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.415 [2024-07-15 09:36:26.352540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.352583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.352623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.354108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.415 [2024-07-15 09:36:26.354161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.354202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.354249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.354654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.354806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.354851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.354893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.354941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.356264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.356327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.356371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.356452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.356822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.356981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.357033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.357074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.357137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.358582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.358643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.416 [2024-07-15 09:36:26.358687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.416 [2024-07-15 09:36:26.358741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:17.416 [... same *ERROR* from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated for each allocation attempt between 09:36:26.358741 and 09:36:26.709374 ...]
00:32:17.940 [2024-07-15 09:36:26.709374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:17.940 [2024-07-15 09:36:26.709428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.709805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.709964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.710022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.710074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.710129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.711400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.711481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.711533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.711574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.711880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.712042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.712092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.712135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.712192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.713683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.713739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.713802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.713844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.714117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.714268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.714312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.714352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.714393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.940 [2024-07-15 09:36:26.715558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.715615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.715661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.715702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.715975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.716123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.716171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.716212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.716253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.717602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.717665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.717706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.717747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.718080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.718228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.718272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.718313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.718353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.719658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.719713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.719754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.719803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.720120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.720269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.940 [2024-07-15 09:36:26.720313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.720353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.720409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.721755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.721806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.721846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.721886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.722196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.722347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.722396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.722436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.722476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.724320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.724382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.724437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.724478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.724742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.724889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.724944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.724985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.725026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.726241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.726291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.726332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.940 [2024-07-15 09:36:26.726380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.726648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.726798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.726850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.726891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.726939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.728237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.728287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.728328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.728375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.728641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.728794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.728839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.728880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.728921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.730163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.730212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.730253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.730299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.730562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.730713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.730762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.730802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.730842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.940 [2024-07-15 09:36:26.732076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.732137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.732190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.732233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.732611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.732760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.732808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.732849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.732894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.734083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.734135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.734176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.734217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.734482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.734629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.734672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.734720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.734764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.736021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.736073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.736114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.736155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.736422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.736569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.940 [2024-07-15 09:36:26.736621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.736664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.736706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.738034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.738084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.738133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.738175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.738448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.738594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.738639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.738685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.738725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.739891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.739950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.740002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.740044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.740313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.740460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.740523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.740567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.740608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.741850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.741917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.741978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.940 [2024-07-15 09:36:26.742019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.742381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.742529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.940 [2024-07-15 09:36:26.742574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.742614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.742655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.743869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.743919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.743966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.744007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.744345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.744497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.744542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.744583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.744623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.745862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.745912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.745958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.746000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.746366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.746514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.746566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.746607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.746649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.941 [2024-07-15 09:36:26.747961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.748011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.748051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.748091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.748408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.748554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.748605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.748656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.748696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.749922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.749977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.750018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.750059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.750367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.750512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.750558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.750610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.750651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.751870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.751920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.751967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.752010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.752277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.752424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.941 [2024-07-15 09:36:26.752473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.752514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.752555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.753746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.753797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.753838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.753878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.754152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.754297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.754366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.754407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.754447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.755626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.755677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.755718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.755773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.756206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.756354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.756402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.756443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.756484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.757646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.757695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.757736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.941 [2024-07-15 09:36:26.757776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.758047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.758195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.758240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.758280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.758321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.759646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.759696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.759737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.759777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.760052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.760201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.760246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.760287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.760328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.761786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.761836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.761877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.761917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.762228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.762373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.762417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.762457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.762509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.941 [2024-07-15 09:36:26.763783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.763833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.763873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.763915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.764226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.764370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.764418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.764458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.764507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.765767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.765817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.765859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.765900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.766281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.766430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.766475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.766515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.766565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.767784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.767842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.767888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.767935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.768198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.768345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.941 [2024-07-15 09:36:26.768389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.768430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.768477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.769695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.769752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.769799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.771286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.771647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.771798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.771843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.771902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.771964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.773289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.774937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.776711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.777624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.777964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.778111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.779492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.781138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.782290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.785103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.786513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.788170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.941 [2024-07-15 09:36:26.788838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.789113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.791015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.792725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.794542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.794937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.797513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.799167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.800522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.801978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.802296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.803790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.805443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.806253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.806643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.809326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.810969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.811629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.813009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.813277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.815179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.816853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.817242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:17.941 [2024-07-15 09:36:26.817633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:17.941 [2024-07-15 09:36:26.820435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats continuously from 09:36:26.820 through 09:36:27.104 (log timestamps 00:32:17.941-00:32:18.205) ...]
00:32:18.464 [2024-07-15 09:36:27.273824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... the same "Failed to get dst_mbufs!" error from accel_dpdk_cryptodev.c:476 repeats continuously from 09:36:27.273 through 09:36:27.354 (log timestamps 00:32:18.464-00:32:18.465) ...]
00:32:18.465 [2024-07-15 09:36:27.357911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats continuously from 09:36:27.357 through 09:36:27.390 (log timestamp 00:32:18.465) ...]
00:32:18.465 [2024-07-15 09:36:27.390555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.392299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.392692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.393084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.393473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.393848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.394354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.394742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.395144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.395536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.397228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.397634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.398036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.398087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.398551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.399055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.399451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.399838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.400234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.402014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.402070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.402455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.402499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.402822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.465 [2024-07-15 09:36:27.403319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.403712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.404103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.404486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.406488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.406546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.406950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.407004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.407406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.465 [2024-07-15 09:36:27.407559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.407958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.408003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.408387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.410340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.410406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.410798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.410846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.411242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.411422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.411820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.411867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.412264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.414166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.414231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.466 [2024-07-15 09:36:27.414620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.414691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.415068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.415222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.415622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.415669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.466 [2024-07-15 09:36:27.416067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.418057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.418123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.418509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.418564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.418911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.419078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.419468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.419511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.419903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.421846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.421913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.422313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.422378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.422792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.422960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.423350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.423393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.727 [2024-07-15 09:36:27.423778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.425610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.425694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.426091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.426142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.426557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.426714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.427113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.427158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.427543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.430298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.430356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.432160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.432206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.432564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.432718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.433120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.433165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.434745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.437368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.437426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.438994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.439040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.439312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.727 [2024-07-15 09:36:27.439468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.439858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.439902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.440307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.443330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.443395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.443780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.443821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.444184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.444338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.445300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.445351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.446985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.449197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.449271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.449652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.449695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.450081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.450235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.451503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.451553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.452823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.454369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.454426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.727 [2024-07-15 09:36:27.454810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.454860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.455135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.455294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.456952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.456998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.727 [2024-07-15 09:36:27.458283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.460946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.461005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.461390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.461434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.461909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.462071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.463825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.463870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.465579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.468305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.468362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.470005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.470051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.470321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.470471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.471370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.471434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.728 [2024-07-15 09:36:27.471820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.474691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.474755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.476377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.476425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.476741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.476898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.478272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.478319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.479944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.481663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.481722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.483212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.483258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.483529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.483681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.485336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.485384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.486304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.489097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.489155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.489672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.489716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.490071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.728 [2024-07-15 09:36:27.490223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.491203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.491251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.492622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.495020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.495076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.496491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.496537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.496809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.496973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.497023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.497064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.497105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.499521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.499578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.500952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.500999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.501269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.501423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.501484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.501527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.501568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.502770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.502821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.728 [2024-07-15 09:36:27.502862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.502902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.503182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.503338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.503382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.503439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.503483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.504778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.504831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.504873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.504914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.505192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.505349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.505399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.505447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.505487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.506632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.506683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.506725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.506767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.507047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.507199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.507250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.507291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.728 [2024-07-15 09:36:27.507337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.508454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.508505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.508548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.508590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.509053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.509224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.509269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.509310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.509351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.728 [2024-07-15 09:36:27.510530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.510589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.510630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.510670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.510948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.511110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.511155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.511197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.511798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.512941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.512992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.513033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.513074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.513414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.729 [2024-07-15 09:36:27.513561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.513606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.513648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.513689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.515021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.515072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.515112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.515154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.515422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.515569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.515616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.515657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.515704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.517019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.517069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.517110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.517150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.517416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.517576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.517620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.517660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.517700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.519099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.519150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.729 [2024-07-15 09:36:27.519193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.519235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.519511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.519658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.519702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.519749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.519792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.520913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.520974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.521017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.521058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.521328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.521475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.521519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.521559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.521607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.522729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.522787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.522831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.522873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.523302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.523448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.523496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.523549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.729 [2024-07-15 09:36:27.523591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.524859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.524941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.524983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.525023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.525290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.525441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.525485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.525530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.525570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.528521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.528579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.528628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.530256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.530632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.530783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.530829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.530871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.530934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.532297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.533992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.534060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.535659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:18.729 [2024-07-15 09:36:27.536012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:18.729 [2024-07-15 09:36:27.536161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:18.729 ... (the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats for several hundred consecutive allocation attempts between 09:36:27.536161 and 09:36:27.892891) ...
00:32:19.055 [2024-07-15 09:36:27.892891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:19.055 [2024-07-15 09:36:27.892942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.894078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.894129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.894174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.894214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.894518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.894668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.894719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.894764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.894810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.896051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.896101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.896141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.896181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.896488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.896639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.896683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.896731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.896772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.898020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.898069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.898110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.898151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.898563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.055 [2024-07-15 09:36:27.898714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.898762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.898802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.898843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.900088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.900141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.900180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.900226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.900554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.900706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.900750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.900790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.900830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.902027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.902082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.902127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.902167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.902440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.902593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.902638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.902681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.902722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.903870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.903921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.055 [2024-07-15 09:36:27.903968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.904009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.904281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.904430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.904475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.055 [2024-07-15 09:36:27.904515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.904563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.905897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.905954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.907424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.907470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.907739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.907890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.907949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.907992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.908033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.909237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.910886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.910938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.911551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.911908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.912064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.912111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.912152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.056 [2024-07-15 09:36:27.912194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.913360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.915012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.915060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.915866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.916141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.916290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.918105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.918153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.919908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.921200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.922381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.922429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.923784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.924097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.924248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.925825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.925873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.926440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.927617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.929272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.929328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.930522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.930853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.056 [2024-07-15 09:36:27.931010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.932598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.932648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.056 [2024-07-15 09:36:27.933054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.934153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.935804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.935851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.936624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.936895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.937051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.938815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.938864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.940518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.057 [2024-07-15 09:36:27.941728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.318 [2024-07-15 09:36:27.943174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.318 [2024-07-15 09:36:27.943224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.943614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.943887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.944044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.945722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.945776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.947508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.951029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.952639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.319 [2024-07-15 09:36:27.952690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.954118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.954522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.954670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.956501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.956551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.956943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.958137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.959794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.959842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.960645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.960954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.961103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.962478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.962526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.964167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.965362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.967091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.967138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.967557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.967827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.967989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.969509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.969557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.319 [2024-07-15 09:36:27.971191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.974659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.975947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.975994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.977713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.978156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.978306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.979644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.979690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.980488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.985031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.986405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.986455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.987851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.988129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.988278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.989191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.989240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.990265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.994091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.995736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.995784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.997024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.997299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.319 [2024-07-15 09:36:27.997448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.998885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:27.998940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.000409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.002907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.003642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.003690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.005052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.005330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.005480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.007124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.007171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.008144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.012367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.013834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.013888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.014856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.015139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.015291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.016363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.016410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.017780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.021721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.023114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.319 [2024-07-15 09:36:28.023162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.024777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.025128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.025277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.026427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.026475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.027731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.031263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.032684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.032730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.034205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.034529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.034677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.036038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.036085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.037722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.319 [2024-07-15 09:36:28.042239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.043619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.043668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.045072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.045345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.045497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.046623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.046670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.320 [2024-07-15 09:36:28.048440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.051979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.052804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.052850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.053799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.054082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.054232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.055599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.056953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.058583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.062287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.063329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.063377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.064856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.065246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.066706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.067236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.068659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.070244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.074801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.076605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.078275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.079475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.079819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.320 [2024-07-15 09:36:28.081272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.081841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.083286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.084654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.090038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.091692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.092523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.094110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.094492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.096038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.096607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.097990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.099551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.103419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.104851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.105246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.106634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.106989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.108277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.109009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.109397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.109905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.114260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.115913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.320 [2024-07-15 09:36:28.116680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.118059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.118395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.119746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.120497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.121857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.123249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.124986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.126346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.127152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.128298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.128595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.130281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.132071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.132466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.134102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.135781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.137196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.137759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.139149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.139582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.139735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.141153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.141741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.320 [2024-07-15 09:36:28.143116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.148047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.148454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.150239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.150624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.151002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.151494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.152920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.153401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.155131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.158410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.159385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.159775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.160184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.160535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.162077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.162802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.164030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.164422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.166977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.320 [2024-07-15 09:36:28.167377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.321 [2024-07-15 09:36:28.167765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.321 [2024-07-15 09:36:28.169059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.321 [2024-07-15 09:36:28.169440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.321 [2024-07-15 09:36:28.170966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.321-00:32:19.847 [2024-07-15 09:36:28.171456 ... 09:36:28.610742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (same message repeated for every failed src_mbuf allocation attempt in this interval) 
00:32:19.847 [2024-07-15 09:36:28.611093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.847 [2024-07-15 09:36:28.612591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.847 [2024-07-15 09:36:28.614229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.847 [2024-07-15 09:36:28.615064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.616796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.620598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.622013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.622400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.623748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.624094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.625294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.626116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.627687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.628272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.631914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.633490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.634917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.636562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.636913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.638311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.639206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.640266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.641139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.646364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.647922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.848 [2024-07-15 09:36:28.648537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.649883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.650218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.651709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.652135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.653658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.654172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.656509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.658274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.658672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.660363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.660775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.661280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.661677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.663165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.663638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.667017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.668046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.668986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.669372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.669732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.669884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.670654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.671857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.848 [2024-07-15 09:36:28.672810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.675215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.676916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.677308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.677700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.678068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.679673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.680084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.681876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.682279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.685769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.686175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.686565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.687257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.687532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.688405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.689544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.689935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.690321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.692821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.693233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.693623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.694022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.694407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.848 [2024-07-15 09:36:28.694905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.695312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.695706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.696105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.698650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.699063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.699456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.699853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.700281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.700776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.701184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.701575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.701973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.704483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.704887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.705290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.705683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.705977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.706472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.706884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.707287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.707674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.710109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.710512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.848 [2024-07-15 09:36:28.710902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.711300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.711668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.712172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.712570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.848 [2024-07-15 09:36:28.712986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.713378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.715895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.716300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.716356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.716743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.717146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.717638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.718038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.718427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.718827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.723527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.723584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.725168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.725217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.725653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.727487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.727877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.729515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.849 [2024-07-15 09:36:28.731313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.735484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.735543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.736535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.736580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.736857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.737595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.737650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.738704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.738753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.741676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.741734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.743373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.743420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.743734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.744838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.744894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.746528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.746575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.750409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.750467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.751756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.751819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.752098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.849 [2024-07-15 09:36:28.752587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.752641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.753030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.753074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.758149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.758207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.758815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.758863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.759144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.759752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.759806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.761029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.761081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.765576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.765634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.766997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.767044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.767458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.767971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.768030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.769594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.769644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.776121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.776181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:19.849 [2024-07-15 09:36:28.776568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.776613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.777051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.778625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.778678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.780214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.780261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.786436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.786495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.787883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.787942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.788396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.788900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.788973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.790602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.790656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.795407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:19.849 [2024-07-15 09:36:28.795470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.797136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.797194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.797588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.798100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.798150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.798916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.111 [2024-07-15 09:36:28.798975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.804587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.804646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.806039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.806089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.806363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.807092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.807144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.807526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.807568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.812864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.812922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.814304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.814350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.814656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.816464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.816524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.816917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.816970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.821970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.822025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.823717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.823770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.111 [2024-07-15 09:36:28.824049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.111 [2024-07-15 09:36:28.825654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.825708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.827515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.827562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.832186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.832244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.833230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.833281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.833555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.835342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.835402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.837182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.837235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.841096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.841154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.842798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.842845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.843238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.845120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.845171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.846875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.846921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.851983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.852041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.112 [2024-07-15 09:36:28.853312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.853358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.853668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.855164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.855218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.856848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.856894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.862030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.862087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.862953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.863001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.863277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.863920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.863980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.865546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.865592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.870626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.870689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.872417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.872475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.872746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.873252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.873304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.873688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.112 [2024-07-15 09:36:28.873731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.879643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.879707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.881490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.881534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.881809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.883607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.883661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.883702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.883744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.888506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.888563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.889205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.889253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.889523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.889676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.889722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.889763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.889804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.894597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.894654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.894708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.894762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.895043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.112 [2024-07-15 09:36:28.895199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.895247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.895288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.895328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.899262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.899314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.899355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.899396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.899668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.899824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.899872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.899913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.899963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.903111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.903165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.903205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.903246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.903519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.903670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.903715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.903756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.903797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.905100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.112 [2024-07-15 09:36:28.905150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.112 [2024-07-15 09:36:28.905190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:20.112 [... identical "Failed to get src_mbufs!" *ERROR* entries from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeat several hundred times between 09:36:28.905 and 09:36:29.262 (console timestamps 00:32:20.112 through 00:32:20.379) ...]
00:32:20.379 [2024-07-15 09:36:29.261828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:20.379 [2024-07-15 09:36:29.262679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.262725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.263049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.264562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.264615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.266259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.266305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.270076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.270132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.271797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.271853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.272132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.273845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.273897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.275232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.275277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.277516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.277573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.277964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.278012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.278335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.279796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.279848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.281246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.379 [2024-07-15 09:36:29.281291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.283763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.283818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.285221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.285266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.285540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.286291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.286343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.286384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.286425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.289014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.289070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.290710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.290754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.291081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.291238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.291287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.291328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.291368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.294282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.294340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.294382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.294424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.294899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.379 [2024-07-15 09:36:29.295063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.295114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.379 [2024-07-15 09:36:29.295156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.295197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.296368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.296437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.296477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.296518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.296792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.296951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.296996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.297036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.297076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.298349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.298404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.298445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.298485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.298756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.298909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.298961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.299003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.299043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.300476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.300526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.380 [2024-07-15 09:36:29.300567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.300607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.300917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.301076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.301120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.301161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.301201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.302446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.302500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.302540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.302581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.302903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.303062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.303107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.303147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.303188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.304445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.304496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.304537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.304579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.304969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.306388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.306440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.306481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.380 [2024-07-15 09:36:29.306528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.307711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.307760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.307815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.307856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.308137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.308288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.308333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.308379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.308427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.309606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.309656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.309703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.309746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.310148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.310294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.310338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.310395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.310437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.311730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.311787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.311828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.311874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.312152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.380 [2024-07-15 09:36:29.312307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.312352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.312393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.312433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.313751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.313801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.313849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.313892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.314171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.314322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.314367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.314407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.314448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.315979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.316034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.316079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.316119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.316397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.316549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.316593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.316638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.316683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.317845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.317896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.380 [2024-07-15 09:36:29.317948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.317989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.318260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.318410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.318454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.318495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.380 [2024-07-15 09:36:29.318536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.319709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.319773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.320170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.320215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.320562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.320711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.320755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.320795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.320836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.322021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.323083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.323142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.324873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.325165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.325322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.325379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.325423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.381 [2024-07-15 09:36:29.325463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.381 [2024-07-15 09:36:29.326689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.642 [2024-07-15 09:36:29.327121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.642 [2024-07-15 09:36:29.327169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.642 [2024-07-15 09:36:29.328792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.329079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.329229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.330995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.331043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.332497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.333649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.335312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.335365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.336220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.336630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.336782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.337186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.337242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.337652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.339641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.341334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.341389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.343138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.343512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.643 [2024-07-15 09:36:29.343662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.345177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.345226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.346694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.347954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.349320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.349368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.350751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.351031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.351185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.351827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.351874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.353192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.354316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.354715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.354761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.355152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.355509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.355658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.356994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.357039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.357683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.358916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.360035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.643 [2024-07-15 09:36:29.360085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.361456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.361729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.361878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.362483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.362534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.364296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.365530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.365936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.365983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.367218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.367593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.367742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.369389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.369437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.370225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.371453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.372118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.372166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.372551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.372834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.372997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.374383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.374429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.643 [2024-07-15 09:36:29.375715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.376893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.377301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.377346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.377730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.378010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.378163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.379727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.379776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.381253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.382498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.384082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.384140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.384531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.384898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.385054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.386444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.386490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.387487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.388700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.389884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.389939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.390324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.390790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.643 [2024-07-15 09:36:29.390951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.392613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.392681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.393077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.394462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.394863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.394917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.643 [2024-07-15 09:36:29.395319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.395664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.395818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.396223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.396268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.396653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.398290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.398691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.398745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.399143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.399582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.399734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.400132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.400193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.400580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.402226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.402624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.644 [2024-07-15 09:36:29.402674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.403077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.403440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.403590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.403987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.404030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.404419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.405921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.406326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.406380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.406774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.407178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.407329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.407717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.407760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.408149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.409617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.410037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.410082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.410474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.410836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.410996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.411399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:20.644 [2024-07-15 09:36:29.411443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:20.644 [2024-07-15 09:36:29.411828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:20.644 [... the identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats, with only the timestamps advancing, from 09:36:29.413204 through 09:36:29.636475 ...]
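The allocation-failure flood above and the verify results below can be checked directly from the raw console output with a couple of shell one-liners (a sketch: "console.log" is a placeholder name for wherever this build log was saved, and the numbers in the awk command are copied from the crypto_ram3 row of the table that follows).

# Count how many times the mbuf allocation failure fired (counts occurrences, not lines,
# since the archived log may have wrapped several messages onto one line).
grep -o 'Failed to get src_mbufs' console.log | wc -l

# Sanity-check one row of the verify table below: crypto_ram3 reports 324.68 IOPS at a
# 65536-byte IO size, which should reproduce the 20.29 MiB/s shown in the same row.
awk 'BEGIN { printf "%.2f MiB/s\n", 324.68 * 65536 / 1048576 }'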
00:32:21.877 
00:32:21.877 Latency(us)
00:32:21.877 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:21.877 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:21.877 Verification LBA range: start 0x0 length 0x100
00:32:21.877 crypto_ram : 5.71 44.82 2.80 0.00 0.00 2737574.07 155918.69 2436343.54
00:32:21.877 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:21.877 Verification LBA range: start 0x100 length 0x100
00:32:21.877 crypto_ram : 5.93 43.15 2.70 0.00 0.00 2885963.24 55392.17 2596821.26
00:32:21.877 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:21.877 Verification LBA range: start 0x0 length 0x100
00:32:21.877 crypto_ram2 : 5.72 46.82 2.93 0.00 0.00 2541432.97 6468.12 2377988.01
00:32:21.877 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:21.877 Verification LBA range: start 0x100 length 0x100
00:32:21.877 crypto_ram2 : 5.93 43.15 2.70 0.00 0.00 2776474.49 54936.26 2567643.49
00:32:21.877 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:21.877 Verification LBA range: start 0x0 length 0x100
00:32:21.877 crypto_ram3 : 5.54 324.68 20.29 0.00 0.00 351670.13 17096.35 590849.78
00:32:21.877 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:21.877 Verification LBA range: start 0x100 length 0x100
00:32:21.877 crypto_ram3 : 5.72 244.09 15.26 0.00 0.00 470378.99 31457.28 915452.44
00:32:21.877 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:21.877 Verification LBA range: start 0x0 length 0x100
00:32:21.877 crypto_ram4 : 5.70 351.70 21.98 0.00 0.00 314640.73 11625.52 506963.70
Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:21.877 Verification LBA range: start 0x100 length 0x100
00:32:21.877 crypto_ram4 : 5.73 245.90 15.37 0.00 0.00 449246.49 23478.98 915452.44
00:32:21.877 ===================================================================================================================
00:32:21.877 Total : 1344.30 84.02 0.00 0.00 704319.04 6468.12 2596821.26
00:32:22.445 
00:32:22.445 real 0m9.208s
00:32:22.445 user 0m16.737s
00:32:22.445 sys 0m0.495s
00:32:22.445 09:36:31 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:22.445 09:36:31 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:32:22.445 ************************************
00:32:22.445 END TEST bdev_verify_big_io
00:32:22.445 ************************************
00:32:22.445 09:36:31 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:22.445 09:36:31 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:22.445 09:36:31 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:32:22.445 09:36:31 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:22.445 09:36:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:22.445 ************************************
00:32:22.445 START TEST bdev_write_zeroes
00:32:22.445 ************************************
00:32:22.445 09:36:31 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:22.445 [2024-07-15 09:36:31.248425] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization...
00:32:22.445 [2024-07-15 09:36:31.248493] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid274171 ]
00:32:22.445 [2024-07-15 09:36:31.378405] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:32:22.704 [2024-07-15 09:36:31.487582] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:22.704 [2024-07-15 09:36:31.508921] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:32:22.704 [2024-07-15 09:36:31.516954] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:32:22.704 [2024-07-15 09:36:31.524970] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:32:22.704 [2024-07-15 09:36:31.632551] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:32:25.235 [2024-07-15 09:36:33.871676] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:32:25.235 [2024-07-15 09:36:33.871748] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:25.235 [2024-07-15 09:36:33.871764] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:25.235 [2024-07-15 09:36:33.879692] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:32:25.235 [2024-07-15 09:36:33.879713] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:25.235 [2024-07-15 09:36:33.879725] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:25.235 [2024-07-15 09:36:33.887713] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:32:25.235 [2024-07-15 09:36:33.887732] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:32:25.235 [2024-07-15 09:36:33.887743] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:25.235 [2024-07-15 09:36:33.895733] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:32:25.236 [2024-07-15 09:36:33.895751] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:32:25.236 [2024-07-15 09:36:33.895764] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:25.236 Running I/O for 1 seconds...
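The "Found key test_dek_aesni_cbc_N" and "Operation encrypt/decrypt will be assigned to module dpdk_cryptodev" notices above are produced by the bdev/accel configuration in the test's bdev.json, which is not reproduced in this log. A roughly equivalent manual setup through rpc.py is sketched below; the key material is a dummy value and the option spellings are from memory and can differ between SPDK releases, so treat it as an illustration of the flow rather than the exact commands this job ran (compare against rpc.py <method> -h in the tree under test).

# Route crypto operations to the DPDK cryptodev accel module using the AESNI-MB software PMD.
./scripts/rpc.py dpdk_cryptodev_set_driver -d crypto_aesni_mb
./scripts/rpc.py accel_assign_opc -o encrypt -m dpdk_cryptodev
./scripts/rpc.py accel_assign_opc -o decrypt -m dpdk_cryptodev
# Register a DEK (the hex key below is made up) and stack a crypto vbdev on a malloc base bdev.
./scripts/rpc.py accel_crypto_key_create -c AES_CBC -k 00112233445566778899aabbccddeeff -n test_dek_aesni_cbc_1
./scripts/rpc.py bdev_malloc_create -b Malloc0 64 512
./scripts/rpc.py bdev_crypto_create -n test_dek_aesni_cbc_1 Malloc0 crypto_ram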
00:32:26.182 
00:32:26.182 Latency(us)
00:32:26.182 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:26.182 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:26.182 crypto_ram : 1.03 1972.52 7.71 0.00 0.00 64436.74 5442.34 77047.54
00:32:26.182 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:26.182 crypto_ram2 : 1.03 1978.24 7.73 0.00 0.00 63897.32 5385.35 71576.71
00:32:26.182 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:26.182 crypto_ram3 : 1.02 15166.34 59.24 0.00 0.00 8316.25 2464.72 10770.70
00:32:26.182 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:26.182 crypto_ram4 : 1.02 15150.98 59.18 0.00 0.00 8288.39 2464.72 8662.15
00:32:26.182 ===================================================================================================================
00:32:26.182 Total : 34268.08 133.86 0.00 0.00 14770.03 2464.72 77047.54
00:32:26.749 
00:32:26.749 real 0m4.269s
00:32:26.749 user 0m3.827s
00:32:26.749 sys 0m0.401s
00:32:26.749 09:36:35 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:26.749 09:36:35 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:32:26.749 ************************************
00:32:26.749 END TEST bdev_write_zeroes
00:32:26.749 ************************************
00:32:26.749 09:36:35 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:26.749 09:36:35 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:26.749 09:36:35 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:32:26.749 09:36:35 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:26.749 09:36:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:26.749 ************************************
00:32:26.749 START TEST bdev_json_nonenclosed
00:32:26.749 ************************************
00:32:26.750 09:36:35 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:26.750 [2024-07-15 09:36:35.596031] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization...
00:32:26.750 [2024-07-15 09:36:35.596095] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid274727 ]
00:32:27.009 [2024-07-15 09:36:35.724639] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:32:27.009 [2024-07-15 09:36:35.821649] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:27.009 [2024-07-15 09:36:35.821719] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:32:27.009 [2024-07-15 09:36:35.821739] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:32:27.009 [2024-07-15 09:36:35.821751] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:32:27.009 
00:32:27.009 real 0m0.385s
00:32:27.009 user 0m0.221s
00:32:27.009 sys 0m0.161s
00:32:27.009 09:36:35 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:32:27.009 09:36:35 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:27.009 09:36:35 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:32:27.009 ************************************
00:32:27.009 END TEST bdev_json_nonenclosed
00:32:27.009 ************************************
00:32:27.009 09:36:35 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234
00:32:27.009 09:36:35 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true
00:32:27.009 09:36:35 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:27.009 09:36:35 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:32:27.009 09:36:35 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:27.009 09:36:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:27.267 ************************************
00:32:27.267 START TEST bdev_json_nonarray
00:32:27.267 ************************************
00:32:27.267 09:36:35 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:27.267 [2024-07-15 09:36:36.046669] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization...
00:32:27.267 [2024-07-15 09:36:36.046730] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid274912 ]
00:32:27.267 [2024-07-15 09:36:36.171743] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:32:27.527 [2024-07-15 09:36:36.268655] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:27.527 [2024-07-15 09:36:36.268729] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
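Both JSON negative tests in this stretch of the log feed bdevperf a deliberately malformed configuration and expect it to be rejected: nonenclosed.json trips the "not enclosed in {}" check and nonarray.json trips the "'subsystems' should be an array" check seen above. The actual contents of those files are not shown in this log; the snippets below are illustrative guesses at the two failure modes, next to a minimal well-formed shape for contrast.

# Guess at nonenclosed.json: top-level content without the enclosing braces.
cat > nonenclosed.json <<'EOF'
"subsystems": []
EOF
# Guess at nonarray.json: "subsystems" present, but as an object rather than an array.
cat > nonarray.json <<'EOF'
{ "subsystems": { "subsystem": "bdev", "config": [] } }
EOF
# Minimal well-formed shape that json_config accepts.
cat > valid.json <<'EOF'
{ "subsystems": [ { "subsystem": "bdev", "config": [] } ] }
EOF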
00:32:27.527 [2024-07-15 09:36:36.268749] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:27.527 [2024-07-15 09:36:36.268764] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:27.527 00:32:27.527 real 0m0.385s 00:32:27.527 user 0m0.221s 00:32:27.527 sys 0m0.161s 00:32:27.527 09:36:36 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:32:27.527 09:36:36 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:27.527 09:36:36 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:32:27.527 ************************************ 00:32:27.527 END TEST bdev_json_nonarray 00:32:27.527 ************************************ 00:32:27.527 09:36:36 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:32:27.527 09:36:36 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:32:27.527 00:32:27.527 real 1m12.655s 00:32:27.527 user 2m39.498s 00:32:27.527 sys 0m9.090s 00:32:27.527 09:36:36 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:27.527 09:36:36 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:27.527 ************************************ 00:32:27.527 END TEST blockdev_crypto_aesni 00:32:27.527 ************************************ 00:32:27.527 09:36:36 -- common/autotest_common.sh@1142 -- # return 0 00:32:27.527 09:36:36 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:32:27.527 09:36:36 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:27.527 09:36:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:27.527 09:36:36 -- common/autotest_common.sh@10 -- # set +x 00:32:27.786 ************************************ 00:32:27.786 START TEST blockdev_crypto_sw 00:32:27.786 ************************************ 00:32:27.786 09:36:36 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:32:27.786 * Looking for test storage... 
00:32:27.786 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:32:27.786 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=274986 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:27.787 09:36:36 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 274986 00:32:27.787 09:36:36 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 274986 ']' 00:32:27.787 09:36:36 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:27.787 09:36:36 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:27.787 09:36:36 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:32:27.787 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:27.787 09:36:36 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:27.787 09:36:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:27.787 [2024-07-15 09:36:36.692354] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:32:27.787 [2024-07-15 09:36:36.692430] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid274986 ] 00:32:28.045 [2024-07-15 09:36:36.818461] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:28.045 [2024-07-15 09:36:36.921175] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:28.613 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:28.613 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:32:28.613 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:32:28.613 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:32:28.613 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:32:28.613 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:28.613 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:28.872 Malloc0 00:32:28.872 Malloc1 00:32:28.872 true 00:32:28.872 true 00:32:28.872 true 00:32:28.872 [2024-07-15 09:36:37.817358] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:28.872 crypto_ram 00:32:29.131 [2024-07-15 09:36:37.825385] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:29.131 crypto_ram2 00:32:29.131 [2024-07-15 09:36:37.833411] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:29.131 crypto_ram3 00:32:29.131 [ 00:32:29.131 { 00:32:29.131 "name": "Malloc1", 00:32:29.131 "aliases": [ 00:32:29.131 "6a953b94-07a6-43ca-a019-21aae3dd055d" 00:32:29.131 ], 00:32:29.131 "product_name": "Malloc disk", 00:32:29.131 "block_size": 4096, 00:32:29.132 "num_blocks": 4096, 00:32:29.132 "uuid": "6a953b94-07a6-43ca-a019-21aae3dd055d", 00:32:29.132 "assigned_rate_limits": { 00:32:29.132 "rw_ios_per_sec": 0, 00:32:29.132 "rw_mbytes_per_sec": 0, 00:32:29.132 "r_mbytes_per_sec": 0, 00:32:29.132 "w_mbytes_per_sec": 0 00:32:29.132 }, 00:32:29.132 "claimed": true, 00:32:29.132 "claim_type": "exclusive_write", 00:32:29.132 "zoned": false, 00:32:29.132 "supported_io_types": { 00:32:29.132 "read": true, 00:32:29.132 "write": true, 00:32:29.132 "unmap": true, 00:32:29.132 "flush": true, 00:32:29.132 "reset": true, 00:32:29.132 "nvme_admin": false, 00:32:29.132 "nvme_io": false, 00:32:29.132 "nvme_io_md": false, 00:32:29.132 "write_zeroes": true, 00:32:29.132 "zcopy": true, 00:32:29.132 "get_zone_info": false, 00:32:29.132 "zone_management": false, 00:32:29.132 "zone_append": false, 00:32:29.132 "compare": false, 00:32:29.132 "compare_and_write": false, 00:32:29.132 "abort": true, 00:32:29.132 "seek_hole": false, 00:32:29.132 "seek_data": false, 00:32:29.132 "copy": true, 00:32:29.132 "nvme_iov_md": false 00:32:29.132 }, 00:32:29.132 "memory_domains": [ 00:32:29.132 { 00:32:29.132 "dma_device_id": "system", 00:32:29.132 "dma_device_type": 1 00:32:29.132 }, 00:32:29.132 { 
00:32:29.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:29.132 "dma_device_type": 2 00:32:29.132 } 00:32:29.132 ], 00:32:29.132 "driver_specific": {} 00:32:29.132 } 00:32:29.132 ] 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:29.132 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:29.132 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:32:29.132 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:29.132 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:29.132 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:29.132 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:32:29.132 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:32:29.132 09:36:37 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:29.132 09:36:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:29.132 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:29.132 09:36:38 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:32:29.132 09:36:38 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:32:29.132 09:36:38 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "6bb6638b-3d5f-54f0-b95d-56f22ebc6c9b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6bb6638b-3d5f-54f0-b95d-56f22ebc6c9b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1ddb38e8-b23a-5f05-88db-85aea48d9507"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "1ddb38e8-b23a-5f05-88db-85aea48d9507",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:29.132 09:36:38 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:32:29.132 09:36:38 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:32:29.132 09:36:38 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:32:29.132 09:36:38 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 274986 00:32:29.132 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 274986 ']' 00:32:29.132 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 274986 00:32:29.132 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:32:29.132 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:29.132 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 274986 00:32:29.392 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:29.392 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:29.392 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 274986' 00:32:29.392 killing process with pid 274986 00:32:29.392 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 274986 00:32:29.392 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 274986 00:32:29.650 09:36:38 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:29.650 09:36:38 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:29.650 09:36:38 
blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:32:29.650 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:29.650 09:36:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:29.650 ************************************ 00:32:29.650 START TEST bdev_hello_world 00:32:29.650 ************************************ 00:32:29.650 09:36:38 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:29.650 [2024-07-15 09:36:38.551249] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:32:29.650 [2024-07-15 09:36:38.551312] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid275215 ] 00:32:29.911 [2024-07-15 09:36:38.679822] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:29.911 [2024-07-15 09:36:38.778868] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:30.171 [2024-07-15 09:36:38.955805] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:30.171 [2024-07-15 09:36:38.955865] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:30.171 [2024-07-15 09:36:38.955881] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:30.171 [2024-07-15 09:36:38.963823] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:30.171 [2024-07-15 09:36:38.963843] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:30.171 [2024-07-15 09:36:38.963854] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:30.171 [2024-07-15 09:36:38.971860] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:30.171 [2024-07-15 09:36:38.971886] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:30.171 [2024-07-15 09:36:38.971898] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:30.171 [2024-07-15 09:36:39.012228] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:30.171 [2024-07-15 09:36:39.012265] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:30.171 [2024-07-15 09:36:39.012285] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:30.171 [2024-07-15 09:36:39.014323] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:30.171 [2024-07-15 09:36:39.014392] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:30.171 [2024-07-15 09:36:39.014408] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:30.171 [2024-07-15 09:36:39.014440] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
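For reference, the hello_bdev run traced above is driven entirely by the --json config passed on its command line. A minimal sketch of how the same crypto_ram vbdev could be recreated by hand over rpc.py is shown below; the bdev name, base bdev, and key name are copied from the bdev_get_bdevs output earlier in this log, while the exact accel_crypto_key_create and bdev_crypto_create flags are recalled from memory and may differ between SPDK versions.
# Sketch only (assumed flags) -- recreate the software-crypto vbdev exercised above
# against a running SPDK application on the default RPC socket.
./scripts/rpc.py bdev_malloc_create -b Malloc0 16 512                  # 16 MiB base bdev, 512-byte blocks
./scripts/rpc.py accel_crypto_key_create -c AES_XTS \
    -k 00112233445566778899aabbccddeeff \
    -e ffeeddccbbaa99887766554433221100 \
    -n test_dek_sw                                                     # key name the vbdev refers to
./scripts/rpc.py bdev_crypto_create Malloc0 crypto_ram -n test_dek_sw  # base_bdev_name, name, key_name as reported above
./scripts/rpc.py bdev_get_bdevs -b crypto_ram                          # driver_specific.crypto should match the dump above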
00:32:30.171 00:32:30.171 [2024-07-15 09:36:39.014459] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:30.430 00:32:30.430 real 0m0.724s 00:32:30.430 user 0m0.486s 00:32:30.430 sys 0m0.218s 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:30.430 ************************************ 00:32:30.430 END TEST bdev_hello_world 00:32:30.430 ************************************ 00:32:30.430 09:36:39 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:30.430 09:36:39 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:30.430 09:36:39 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:30.430 09:36:39 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:30.430 09:36:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:30.430 ************************************ 00:32:30.430 START TEST bdev_bounds 00:32:30.430 ************************************ 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=275372 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 275372' 00:32:30.430 Process bdevio pid: 275372 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 275372 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 275372 ']' 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:30.430 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:30.430 09:36:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:30.430 [2024-07-15 09:36:39.356126] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:32:30.430 [2024-07-15 09:36:39.356193] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid275372 ] 00:32:30.689 [2024-07-15 09:36:39.484772] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:30.689 [2024-07-15 09:36:39.592905] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:30.689 [2024-07-15 09:36:39.593005] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:30.689 [2024-07-15 09:36:39.593011] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:30.949 [2024-07-15 09:36:39.763806] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:30.949 [2024-07-15 09:36:39.763878] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:30.949 [2024-07-15 09:36:39.763893] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:30.949 [2024-07-15 09:36:39.771828] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:30.949 [2024-07-15 09:36:39.771848] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:30.949 [2024-07-15 09:36:39.771859] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:30.949 [2024-07-15 09:36:39.779849] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:30.949 [2024-07-15 09:36:39.779868] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:30.949 [2024-07-15 09:36:39.779880] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:31.517 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:31.517 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:31.517 09:36:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:31.517 I/O targets: 00:32:31.517 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:32:31.517 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:32:31.517 00:32:31.517 00:32:31.517 CUnit - A unit testing framework for C - Version 2.1-3 00:32:31.517 http://cunit.sourceforge.net/ 00:32:31.517 00:32:31.517 00:32:31.517 Suite: bdevio tests on: crypto_ram3 00:32:31.517 Test: blockdev write read block ...passed 00:32:31.517 Test: blockdev write zeroes read block ...passed 00:32:31.518 Test: blockdev write zeroes read no split ...passed 00:32:31.518 Test: blockdev write zeroes read split ...passed 00:32:31.518 Test: blockdev write zeroes read split partial ...passed 00:32:31.518 Test: blockdev reset ...passed 00:32:31.518 Test: blockdev write read 8 blocks ...passed 00:32:31.518 Test: blockdev write read size > 128k ...passed 00:32:31.518 Test: blockdev write read invalid size ...passed 00:32:31.518 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:31.518 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:31.518 Test: blockdev write read max offset ...passed 00:32:31.518 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:31.518 Test: blockdev writev readv 8 blocks 
...passed 00:32:31.518 Test: blockdev writev readv 30 x 1block ...passed 00:32:31.518 Test: blockdev writev readv block ...passed 00:32:31.518 Test: blockdev writev readv size > 128k ...passed 00:32:31.518 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:31.518 Test: blockdev comparev and writev ...passed 00:32:31.518 Test: blockdev nvme passthru rw ...passed 00:32:31.518 Test: blockdev nvme passthru vendor specific ...passed 00:32:31.518 Test: blockdev nvme admin passthru ...passed 00:32:31.518 Test: blockdev copy ...passed 00:32:31.518 Suite: bdevio tests on: crypto_ram 00:32:31.518 Test: blockdev write read block ...passed 00:32:31.518 Test: blockdev write zeroes read block ...passed 00:32:31.518 Test: blockdev write zeroes read no split ...passed 00:32:31.518 Test: blockdev write zeroes read split ...passed 00:32:31.518 Test: blockdev write zeroes read split partial ...passed 00:32:31.518 Test: blockdev reset ...passed 00:32:31.518 Test: blockdev write read 8 blocks ...passed 00:32:31.518 Test: blockdev write read size > 128k ...passed 00:32:31.518 Test: blockdev write read invalid size ...passed 00:32:31.518 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:31.518 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:31.518 Test: blockdev write read max offset ...passed 00:32:31.518 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:31.518 Test: blockdev writev readv 8 blocks ...passed 00:32:31.518 Test: blockdev writev readv 30 x 1block ...passed 00:32:31.518 Test: blockdev writev readv block ...passed 00:32:31.518 Test: blockdev writev readv size > 128k ...passed 00:32:31.518 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:31.518 Test: blockdev comparev and writev ...passed 00:32:31.518 Test: blockdev nvme passthru rw ...passed 00:32:31.518 Test: blockdev nvme passthru vendor specific ...passed 00:32:31.518 Test: blockdev nvme admin passthru ...passed 00:32:31.518 Test: blockdev copy ...passed 00:32:31.518 00:32:31.518 Run Summary: Type Total Ran Passed Failed Inactive 00:32:31.518 suites 2 2 n/a 0 0 00:32:31.518 tests 46 46 46 0 0 00:32:31.518 asserts 260 260 260 0 n/a 00:32:31.518 00:32:31.518 Elapsed time = 0.083 seconds 00:32:31.518 0 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 275372 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 275372 ']' 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 275372 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 275372 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 275372' 00:32:31.777 killing process with pid 275372 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 275372 00:32:31.777 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # 
wait 275372 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:32.036 00:32:32.036 real 0m1.437s 00:32:32.036 user 0m3.748s 00:32:32.036 sys 0m0.382s 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:32.036 ************************************ 00:32:32.036 END TEST bdev_bounds 00:32:32.036 ************************************ 00:32:32.036 09:36:40 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:32.036 09:36:40 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:32:32.036 09:36:40 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:32.036 09:36:40 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:32.036 09:36:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:32.036 ************************************ 00:32:32.036 START TEST bdev_nbd 00:32:32.036 ************************************ 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=275581 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 275581 /var/tmp/spdk-nbd.sock 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 275581 ']' 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:32:32.036 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:32.036 09:36:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:32.036 [2024-07-15 09:36:40.869520] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:32:32.036 [2024-07-15 09:36:40.869585] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:32.295 [2024-07-15 09:36:40.997648] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:32.295 [2024-07-15 09:36:41.099232] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:32.554 [2024-07-15 09:36:41.272937] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:32.554 [2024-07-15 09:36:41.272999] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:32.554 [2024-07-15 09:36:41.273014] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:32.554 [2024-07-15 09:36:41.280955] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:32.554 [2024-07-15 09:36:41.280976] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:32.554 [2024-07-15 09:36:41.280987] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:32.554 [2024-07-15 09:36:41.288975] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:32.554 [2024-07-15 09:36:41.288994] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:32.554 [2024-07-15 09:36:41.289005] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:33.122 09:36:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:33.122 09:36:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:33.122 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:32:33.122 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:33.123 09:36:41 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:33.123 09:36:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:33.123 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:33.382 1+0 records in 00:32:33.382 1+0 records out 00:32:33.382 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265239 s, 15.4 MB/s 00:32:33.382 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:33.382 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:33.382 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:33.382 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:33.382 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:33.382 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:33.382 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:33.382 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:33.642 1+0 records in 00:32:33.642 1+0 records out 00:32:33.642 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339826 s, 12.1 MB/s 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:33.642 { 00:32:33.642 "nbd_device": "/dev/nbd0", 00:32:33.642 "bdev_name": "crypto_ram" 00:32:33.642 }, 00:32:33.642 { 00:32:33.642 "nbd_device": "/dev/nbd1", 00:32:33.642 "bdev_name": "crypto_ram3" 00:32:33.642 } 00:32:33.642 ]' 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:33.642 { 00:32:33.642 "nbd_device": "/dev/nbd0", 00:32:33.642 "bdev_name": "crypto_ram" 00:32:33.642 }, 00:32:33.642 { 00:32:33.642 "nbd_device": "/dev/nbd1", 00:32:33.642 "bdev_name": "crypto_ram3" 00:32:33.642 } 00:32:33.642 ]' 00:32:33.642 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:33.901 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 
/dev/nbd1' 00:32:33.901 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:33.901 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:33.901 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:33.901 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:33.901 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:33.901 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:34.160 09:36:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:34.419 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- 
# grep -c /dev/nbd 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:34.679 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:34.939 /dev/nbd0 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:34.939 1+0 records in 00:32:34.939 1+0 records out 00:32:34.939 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240745 s, 17.0 MB/s 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:34.939 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:32:35.198 /dev/nbd1 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:35.198 09:36:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:35.198 1+0 records in 00:32:35.198 1+0 records out 00:32:35.198 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298405 s, 13.7 MB/s 00:32:35.198 09:36:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:35.198 09:36:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:35.198 09:36:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:35.198 09:36:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:35.198 09:36:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:35.198 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:35.198 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:35.198 09:36:44 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:35.199 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:35.199 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:35.458 { 00:32:35.458 "nbd_device": "/dev/nbd0", 00:32:35.458 "bdev_name": "crypto_ram" 00:32:35.458 }, 00:32:35.458 { 00:32:35.458 "nbd_device": "/dev/nbd1", 00:32:35.458 "bdev_name": "crypto_ram3" 00:32:35.458 } 00:32:35.458 ]' 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:35.458 { 00:32:35.458 "nbd_device": "/dev/nbd0", 00:32:35.458 "bdev_name": "crypto_ram" 00:32:35.458 }, 00:32:35.458 { 00:32:35.458 "nbd_device": "/dev/nbd1", 00:32:35.458 "bdev_name": "crypto_ram3" 00:32:35.458 } 00:32:35.458 ]' 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:35.458 /dev/nbd1' 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:35.458 /dev/nbd1' 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:35.458 256+0 records in 00:32:35.458 256+0 records out 00:32:35.458 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115019 s, 91.2 MB/s 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:35.458 256+0 records in 00:32:35.458 256+0 records out 00:32:35.458 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0291182 s, 36.0 MB/s 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:35.458 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:35.717 256+0 records in 00:32:35.717 256+0 records out 00:32:35.717 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0420132 s, 25.0 MB/s 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:35.717 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:35.976 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:35.976 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:35.976 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:35.976 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:35.976 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:35.976 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:35.976 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:35.976 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:35.976 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:35.976 09:36:44 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:36.234 09:36:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:36.493 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:36.751 malloc_lvol_verify 00:32:36.751 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:37.010 
e4c95eaa-25a7-40d5-bbad-767088d6415e 00:32:37.010 09:36:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:37.268 4a29f1c8-4a87-49ea-91ce-c71b45f34d9b 00:32:37.269 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:37.613 /dev/nbd0 00:32:37.613 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:37.613 mke2fs 1.46.5 (30-Dec-2021) 00:32:37.613 Discarding device blocks: 0/4096 done 00:32:37.613 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:37.613 00:32:37.613 Allocating group tables: 0/1 done 00:32:37.613 Writing inode tables: 0/1 done 00:32:37.613 Creating journal (1024 blocks): done 00:32:37.613 Writing superblocks and filesystem accounting information: 0/1 done 00:32:37.613 00:32:37.613 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:37.613 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:37.613 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:37.613 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:37.613 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:37.613 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:37.613 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:37.613 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 275581 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 275581 ']' 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 275581 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 275581 
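The nbd_common.sh helpers traced above reduce to a short RPC round-trip. As a standalone illustration, the same flow can be driven directly with the rpc.py calls that appear in this log; the socket path, bdev name, and dd/cmp parameters are copied from the trace, while the modprobe step and the /tmp file name are assumptions.
# Sketch only -- export a bdev over the kernel NBD driver and run the same
# direct-I/O write/compare check the test performs against nbdrandtest.
sudo modprobe nbd                                                       # assumed: nbd module not already loaded
./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256                # 1 MiB random pattern
dd if=/tmp/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct      # write it through the NBD export
cmp -b -n 1M /tmp/nbdrandtest /dev/nbd0                                 # same verify step as nbd_dd_data_verify
./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks                # expect [] once the export is removed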
00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 275581' 00:32:37.872 killing process with pid 275581 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 275581 00:32:37.872 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 275581 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:38.129 00:32:38.129 real 0m6.028s 00:32:38.129 user 0m8.669s 00:32:38.129 sys 0m2.399s 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:38.129 ************************************ 00:32:38.129 END TEST bdev_nbd 00:32:38.129 ************************************ 00:32:38.129 09:36:46 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:38.129 09:36:46 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:38.129 09:36:46 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:32:38.129 09:36:46 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:32:38.129 09:36:46 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:38.129 09:36:46 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:38.129 09:36:46 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:38.129 09:36:46 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:38.129 ************************************ 00:32:38.129 START TEST bdev_fio 00:32:38.129 ************************************ 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:38.129 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local 
env_context= 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:38.129 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:38.130 09:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:38.130 ************************************ 00:32:38.130 START TEST bdev_fio_rw_verify 00:32:38.130 ************************************ 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:38.130 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:38.391 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:38.391 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:38.391 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:38.391 09:36:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:38.648 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:38.648 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:38.648 fio-3.35 00:32:38.648 Starting 2 threads 00:32:50.838 00:32:50.838 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=276686: Mon Jul 15 09:36:57 2024 00:32:50.838 read: IOPS=21.8k, BW=85.1MiB/s (89.2MB/s)(851MiB/10001msec) 00:32:50.838 slat (nsec): min=14252, max=88922, avg=19981.24, stdev=3570.18 00:32:50.839 clat (usec): min=7, max=441, avg=145.73, stdev=58.09 00:32:50.839 lat (usec): min=25, max=473, avg=165.71, stdev=59.48 00:32:50.839 clat percentiles (usec): 00:32:50.839 | 50.000th=[ 143], 99.000th=[ 277], 99.900th=[ 302], 99.990th=[ 347], 00:32:50.839 | 99.999th=[ 408] 00:32:50.839 write: IOPS=26.2k, BW=102MiB/s (107MB/s)(972MiB/9483msec); 0 zone resets 00:32:50.839 slat (usec): min=14, max=130, avg=33.82, stdev= 4.23 00:32:50.839 clat (usec): min=24, max=2389, avg=195.97, stdev=89.91 00:32:50.839 lat (usec): min=51, max=2426, avg=229.79, stdev=91.47 00:32:50.839 clat percentiles (usec): 00:32:50.839 | 50.000th=[ 190], 99.000th=[ 388], 99.900th=[ 408], 99.990th=[ 627], 00:32:50.839 | 99.999th=[ 2343] 00:32:50.839 bw ( KiB/s): min=93128, max=105696, per=94.81%, avg=99488.00, stdev=1689.89, samples=38 00:32:50.839 iops : min=23282, max=26424, avg=24872.00, stdev=422.47, samples=38 00:32:50.839 lat (usec) : 10=0.01%, 20=0.01%, 50=4.60%, 100=14.80%, 250=63.24% 00:32:50.839 lat (usec) : 500=17.33%, 750=0.01%, 1000=0.01% 00:32:50.839 lat (msec) : 4=0.01% 00:32:50.839 cpu : usr=99.57%, sys=0.01%, ctx=34, majf=0, minf=375 00:32:50.839 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:50.839 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:50.839 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:50.839 issued rwts: total=217832,248782,0,0 short=0,0,0,0 dropped=0,0,0,0 00:32:50.839 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:50.839 00:32:50.839 Run status group 0 (all jobs): 00:32:50.839 READ: bw=85.1MiB/s (89.2MB/s), 85.1MiB/s-85.1MiB/s (89.2MB/s-89.2MB/s), io=851MiB (892MB), run=10001-10001msec 00:32:50.839 WRITE: bw=102MiB/s (107MB/s), 102MiB/s-102MiB/s (107MB/s-107MB/s), io=972MiB (1019MB), run=9483-9483msec 00:32:50.839 00:32:50.839 real 0m11.269s 00:32:50.839 user 0m23.314s 00:32:50.839 sys 0m0.335s 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:50.839 ************************************ 00:32:50.839 END TEST bdev_fio_rw_verify 00:32:50.839 ************************************ 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- 
bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "6bb6638b-3d5f-54f0-b95d-56f22ebc6c9b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6bb6638b-3d5f-54f0-b95d-56f22ebc6c9b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1ddb38e8-b23a-5f05-88db-85aea48d9507"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "1ddb38e8-b23a-5f05-88db-85aea48d9507",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": 
false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:32:50.839 crypto_ram3 ]] 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "6bb6638b-3d5f-54f0-b95d-56f22ebc6c9b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6bb6638b-3d5f-54f0-b95d-56f22ebc6c9b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1ddb38e8-b23a-5f05-88db-85aea48d9507"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "1ddb38e8-b23a-5f05-88db-85aea48d9507",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' 
},' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:50.839 ************************************ 00:32:50.839 START TEST bdev_fio_trim 00:32:50.839 ************************************ 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 
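Annotation (not part of the captured output): the fio_bdev/fio_plugin helper whose trace surrounds this point only detects a possible sanitizer runtime and then preloads it together with the SPDK fio plugin before invoking fio. A condensed sketch using this run's paths and flags (the real helper in autotest_common.sh does a little more bookkeeping):

  plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
  # look for libasan / libclang_rt.asan among the plugin's dependencies; empty on this node
  asan_lib=$(ldd "$plugin" | grep libasan | awk '{print $3}')
  # preload the (optional) sanitizer runtime plus the spdk_bdev ioengine, then run fio
  LD_PRELOAD="$asan_lib $plugin" /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio \
    --verify_state_save=0 \
    --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json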
00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:50.839 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:50.840 09:36:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:50.840 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:50.840 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:50.840 fio-3.35 00:32:50.840 Starting 2 threads 00:33:00.801 00:33:00.801 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=278356: Mon Jul 15 09:37:09 2024 00:33:00.801 write: IOPS=29.6k, BW=116MiB/s (121MB/s)(1156MiB/10001msec); 0 zone resets 00:33:00.801 slat (usec): min=13, max=1970, avg=29.28, stdev=16.58 00:33:00.801 clat (usec): min=37, max=2144, avg=223.14, stdev=218.84 00:33:00.801 lat (usec): min=51, max=2161, avg=252.41, stdev=234.36 00:33:00.801 clat percentiles (usec): 00:33:00.801 | 50.000th=[ 119], 99.000th=[ 709], 99.900th=[ 742], 99.990th=[ 914], 00:33:00.801 | 99.999th=[ 1532] 00:33:00.801 bw ( KiB/s): min=116848, max=119904, per=100.00%, avg=118463.58, stdev=366.40, samples=38 00:33:00.801 iops : min=29212, max=29976, avg=29615.89, stdev=91.60, samples=38 00:33:00.801 trim: IOPS=29.6k, BW=116MiB/s (121MB/s)(1156MiB/10001msec); 0 
zone resets 00:33:00.801 slat (usec): min=5, max=537, avg=13.23, stdev= 7.39 00:33:00.801 clat (usec): min=51, max=2161, avg=148.91, stdev=65.59 00:33:00.801 lat (usec): min=57, max=2169, avg=162.15, stdev=70.89 00:33:00.801 clat percentiles (usec): 00:33:00.801 | 50.000th=[ 137], 99.000th=[ 322], 99.900th=[ 343], 99.990th=[ 408], 00:33:00.801 | 99.999th=[ 2114] 00:33:00.801 bw ( KiB/s): min=116840, max=119904, per=100.00%, avg=118465.26, stdev=367.17, samples=38 00:33:00.801 iops : min=29210, max=29976, avg=29616.32, stdev=91.79, samples=38 00:33:00.801 lat (usec) : 50=4.84%, 100=25.76%, 250=52.42%, 500=5.93%, 750=11.00% 00:33:00.801 lat (usec) : 1000=0.04% 00:33:00.801 lat (msec) : 2=0.01%, 4=0.01% 00:33:00.801 cpu : usr=99.45%, sys=0.00%, ctx=33, majf=0, minf=258 00:33:00.801 IO depths : 1=9.4%, 2=20.4%, 4=56.1%, 8=14.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:00.801 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:00.801 complete : 0=0.0%, 4=87.7%, 8=12.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:00.801 issued rwts: total=0,295844,295844,0 short=0,0,0,0 dropped=0,0,0,0 00:33:00.801 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:00.801 00:33:00.801 Run status group 0 (all jobs): 00:33:00.801 WRITE: bw=116MiB/s (121MB/s), 116MiB/s-116MiB/s (121MB/s-121MB/s), io=1156MiB (1212MB), run=10001-10001msec 00:33:00.801 TRIM: bw=116MiB/s (121MB/s), 116MiB/s-116MiB/s (121MB/s-121MB/s), io=1156MiB (1212MB), run=10001-10001msec 00:33:00.801 00:33:00.801 real 0m11.096s 00:33:00.801 user 0m23.801s 00:33:00.801 sys 0m0.326s 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:00.801 ************************************ 00:33:00.801 END TEST bdev_fio_trim 00:33:00.801 ************************************ 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:33:00.801 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:33:00.801 00:33:00.801 real 0m22.722s 00:33:00.801 user 0m47.299s 00:33:00.801 sys 0m0.855s 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:00.801 ************************************ 00:33:00.801 END TEST bdev_fio 00:33:00.801 ************************************ 00:33:00.801 09:37:09 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:00.801 09:37:09 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:00.801 09:37:09 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:00.801 09:37:09 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:00.801 09:37:09 
blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:00.801 09:37:09 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:00.801 ************************************ 00:33:00.801 START TEST bdev_verify 00:33:00.801 ************************************ 00:33:00.801 09:37:09 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:01.059 [2024-07-15 09:37:09.775655] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:33:01.059 [2024-07-15 09:37:09.775716] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid279615 ] 00:33:01.059 [2024-07-15 09:37:09.903050] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:01.059 [2024-07-15 09:37:10.004719] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:01.059 [2024-07-15 09:37:10.004724] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:01.318 [2024-07-15 09:37:10.171189] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:01.318 [2024-07-15 09:37:10.171256] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:01.318 [2024-07-15 09:37:10.171272] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:01.318 [2024-07-15 09:37:10.179210] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:01.318 [2024-07-15 09:37:10.179230] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:01.318 [2024-07-15 09:37:10.179242] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:01.318 [2024-07-15 09:37:10.187231] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:01.318 [2024-07-15 09:37:10.187250] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:01.318 [2024-07-15 09:37:10.187262] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:01.318 Running I/O for 5 seconds... 
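Annotation (not part of the captured output): the bdevperf run started above consumes the same bdev.json as the fio passes; judging from the bdev_get_bdevs dumps earlier in this suite it describes malloc bdevs wrapped by software crypto vbdevs. A minimal illustrative sketch of such a config, with method and parameter names inferred from that dump (key creation and the second malloc/crypto pair are omitted here, which is also why the vbdev creation notices above report "deferred pending base bdev arrival" until Malloc0/Malloc1 exist):

  cat > /tmp/crypto_sw_sketch.json <<'EOF'
  {
    "subsystems": [
      { "subsystem": "bdev",
        "config": [
          { "method": "bdev_malloc_create",
            "params": { "name": "Malloc0", "num_blocks": 32768, "block_size": 512 } },
          { "method": "bdev_crypto_create",
            "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram", "key_name": "test_dek_sw" } }
        ]
      }
    ]
  }
  EOF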
00:33:06.580 00:33:06.580 Latency(us) 00:33:06.580 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:06.580 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:06.580 Verification LBA range: start 0x0 length 0x800 00:33:06.580 crypto_ram : 5.01 5824.74 22.75 0.00 0.00 21883.90 1631.28 27810.06 00:33:06.580 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:06.580 Verification LBA range: start 0x800 length 0x800 00:33:06.580 crypto_ram : 5.01 5828.09 22.77 0.00 0.00 21874.01 1802.24 27582.11 00:33:06.580 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:06.580 Verification LBA range: start 0x0 length 0x800 00:33:06.580 crypto_ram3 : 5.03 2928.98 11.44 0.00 0.00 43453.52 1937.59 32141.13 00:33:06.580 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:06.580 Verification LBA range: start 0x800 length 0x800 00:33:06.580 crypto_ram3 : 5.02 2930.57 11.45 0.00 0.00 43424.96 2251.02 32597.04 00:33:06.580 =================================================================================================================== 00:33:06.580 Total : 17512.38 68.41 0.00 0.00 29107.62 1631.28 32597.04 00:33:06.580 00:33:06.580 real 0m5.784s 00:33:06.580 user 0m10.871s 00:33:06.580 sys 0m0.243s 00:33:06.580 09:37:15 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:06.580 09:37:15 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:06.580 ************************************ 00:33:06.580 END TEST bdev_verify 00:33:06.580 ************************************ 00:33:06.838 09:37:15 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:06.838 09:37:15 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:06.838 09:37:15 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:06.838 09:37:15 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:06.838 09:37:15 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.838 ************************************ 00:33:06.838 START TEST bdev_verify_big_io 00:33:06.838 ************************************ 00:33:06.838 09:37:15 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:06.838 [2024-07-15 09:37:15.636840] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:33:06.838 [2024-07-15 09:37:15.636903] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid280337 ] 00:33:06.838 [2024-07-15 09:37:15.767355] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:07.096 [2024-07-15 09:37:15.876719] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:07.096 [2024-07-15 09:37:15.876726] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:07.096 [2024-07-15 09:37:16.048601] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:07.096 [2024-07-15 09:37:16.048668] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:07.096 [2024-07-15 09:37:16.048683] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:07.354 [2024-07-15 09:37:16.056622] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:07.354 [2024-07-15 09:37:16.056642] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:07.354 [2024-07-15 09:37:16.056654] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:07.354 [2024-07-15 09:37:16.064646] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:07.354 [2024-07-15 09:37:16.064665] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:07.354 [2024-07-15 09:37:16.064676] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:07.354 Running I/O for 5 seconds... 
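Annotation (not part of the captured output): in the big-I/O summary that follows, the MiB/s column is simply IOPS multiplied by the 64 KiB I/O size used in this pass (-o 65536), which gives a quick sanity check of the table. For the first crypto_ram row, for example:

  # 476.92 IOPS * 65536 bytes per I/O, expressed in MiB/s
  echo 'scale=2; 476.92 * 65536 / 1048576' | bc   # prints 29.80, i.e. the 29.81 MiB/s shown (bc truncates, the log rounds)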
00:33:12.645 00:33:12.645 Latency(us) 00:33:12.645 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:12.645 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:12.645 Verification LBA range: start 0x0 length 0x80 00:33:12.645 crypto_ram : 5.10 476.92 29.81 0.00 0.00 262078.90 7123.48 351956.81 00:33:12.645 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:12.645 Verification LBA range: start 0x80 length 0x80 00:33:12.645 crypto_ram : 5.10 501.83 31.36 0.00 0.00 249402.70 6696.07 346485.98 00:33:12.645 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:12.645 Verification LBA range: start 0x0 length 0x80 00:33:12.645 crypto_ram3 : 5.29 266.28 16.64 0.00 0.00 452663.39 5812.76 357427.65 00:33:12.645 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:12.645 Verification LBA range: start 0x80 length 0x80 00:33:12.645 crypto_ram3 : 5.28 266.82 16.68 0.00 0.00 453046.99 6496.61 353780.42 00:33:12.645 =================================================================================================================== 00:33:12.645 Total : 1511.86 94.49 0.00 0.00 326727.33 5812.76 357427.65 00:33:12.935 00:33:12.935 real 0m6.081s 00:33:12.935 user 0m11.434s 00:33:12.935 sys 0m0.256s 00:33:12.935 09:37:21 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:12.935 09:37:21 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:33:12.935 ************************************ 00:33:12.935 END TEST bdev_verify_big_io 00:33:12.935 ************************************ 00:33:12.935 09:37:21 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:12.935 09:37:21 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:12.935 09:37:21 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:12.935 09:37:21 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:12.935 09:37:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:12.935 ************************************ 00:33:12.935 START TEST bdev_write_zeroes 00:33:12.935 ************************************ 00:33:12.935 09:37:21 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:12.935 [2024-07-15 09:37:21.805814] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:33:12.935 [2024-07-15 09:37:21.805874] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid281219 ] 00:33:13.193 [2024-07-15 09:37:21.934672] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:13.193 [2024-07-15 09:37:22.035576] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:13.451 [2024-07-15 09:37:22.212255] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:13.451 [2024-07-15 09:37:22.212329] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:13.451 [2024-07-15 09:37:22.212345] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.451 [2024-07-15 09:37:22.220273] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:13.451 [2024-07-15 09:37:22.220291] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:13.451 [2024-07-15 09:37:22.220303] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.451 [2024-07-15 09:37:22.228293] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:13.451 [2024-07-15 09:37:22.228310] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:13.451 [2024-07-15 09:37:22.228322] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.451 Running I/O for 1 seconds... 00:33:14.383 00:33:14.383 Latency(us) 00:33:14.383 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:14.383 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:14.383 crypto_ram : 1.01 26623.38 104.00 0.00 0.00 4795.58 1289.35 6582.09 00:33:14.383 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:14.383 crypto_ram3 : 1.01 13284.79 51.89 0.00 0.00 9556.71 5926.73 9858.89 00:33:14.383 =================================================================================================================== 00:33:14.383 Total : 39908.17 155.89 0.00 0.00 6382.62 1289.35 9858.89 00:33:14.641 00:33:14.641 real 0m1.758s 00:33:14.641 user 0m1.499s 00:33:14.641 sys 0m0.237s 00:33:14.641 09:37:23 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:14.641 09:37:23 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:14.641 ************************************ 00:33:14.641 END TEST bdev_write_zeroes 00:33:14.641 ************************************ 00:33:14.641 09:37:23 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:14.641 09:37:23 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:14.641 09:37:23 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:14.641 09:37:23 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:14.641 09:37:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:14.641 
************************************ 00:33:14.641 START TEST bdev_json_nonenclosed 00:33:14.641 ************************************ 00:33:14.641 09:37:23 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:14.899 [2024-07-15 09:37:23.633028] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:33:14.899 [2024-07-15 09:37:23.633089] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid281415 ] 00:33:14.899 [2024-07-15 09:37:23.762690] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:15.156 [2024-07-15 09:37:23.863893] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:15.156 [2024-07-15 09:37:23.863963] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:33:15.156 [2024-07-15 09:37:23.863984] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:15.156 [2024-07-15 09:37:23.863996] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:15.156 00:33:15.156 real 0m0.400s 00:33:15.156 user 0m0.235s 00:33:15.156 sys 0m0.162s 00:33:15.156 09:37:23 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:15.156 09:37:23 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:15.156 09:37:23 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:15.156 ************************************ 00:33:15.156 END TEST bdev_json_nonenclosed 00:33:15.156 ************************************ 00:33:15.157 09:37:24 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:15.157 09:37:24 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:33:15.157 09:37:24 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:15.157 09:37:24 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:15.157 09:37:24 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:15.157 09:37:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:15.157 ************************************ 00:33:15.157 START TEST bdev_json_nonarray 00:33:15.157 ************************************ 00:33:15.157 09:37:24 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:15.157 [2024-07-15 09:37:24.109100] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:33:15.157 [2024-07-15 09:37:24.109161] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid281441 ] 00:33:15.415 [2024-07-15 09:37:24.239866] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:15.415 [2024-07-15 09:37:24.340670] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:15.415 [2024-07-15 09:37:24.340750] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:33:15.415 [2024-07-15 09:37:24.340774] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:15.415 [2024-07-15 09:37:24.340788] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:15.673 00:33:15.673 real 0m0.397s 00:33:15.673 user 0m0.235s 00:33:15.673 sys 0m0.159s 00:33:15.673 09:37:24 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:15.674 ************************************ 00:33:15.674 END TEST bdev_json_nonarray 00:33:15.674 ************************************ 00:33:15.674 09:37:24 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:15.674 09:37:24 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:33:15.674 09:37:24 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:33:15.674 09:37:24 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:33:15.674 09:37:24 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:33:15.674 09:37:24 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:33:15.674 09:37:24 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:33:15.674 09:37:24 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:15.674 09:37:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:15.674 ************************************ 00:33:15.674 START TEST bdev_crypto_enomem 00:33:15.674 ************************************ 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=281628 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # 
waitforlisten 281628 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 281628 ']' 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:15.674 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:15.674 09:37:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:15.674 [2024-07-15 09:37:24.589597] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:33:15.674 [2024-07-15 09:37:24.589666] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid281628 ] 00:33:15.933 [2024-07-15 09:37:24.708691] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:15.933 [2024-07-15 09:37:24.810795] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:16.868 true 00:33:16.868 base0 00:33:16.868 true 00:33:16.868 [2024-07-15 09:37:25.551546] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:16.868 crypt0 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:16.868 09:37:25 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:16.868 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:16.868 [ 00:33:16.868 { 00:33:16.868 "name": "crypt0", 00:33:16.868 "aliases": [ 00:33:16.868 "d2eee1ca-2f6f-5ff2-b022-1457a51420b1" 00:33:16.868 ], 00:33:16.868 "product_name": "crypto", 00:33:16.868 "block_size": 512, 00:33:16.868 "num_blocks": 2097152, 00:33:16.868 "uuid": "d2eee1ca-2f6f-5ff2-b022-1457a51420b1", 00:33:16.868 "assigned_rate_limits": { 00:33:16.868 "rw_ios_per_sec": 0, 00:33:16.868 "rw_mbytes_per_sec": 0, 00:33:16.869 "r_mbytes_per_sec": 0, 00:33:16.869 "w_mbytes_per_sec": 0 00:33:16.869 }, 00:33:16.869 "claimed": false, 00:33:16.869 "zoned": false, 00:33:16.869 "supported_io_types": { 00:33:16.869 "read": true, 00:33:16.869 "write": true, 00:33:16.869 "unmap": false, 00:33:16.869 "flush": false, 00:33:16.869 "reset": true, 00:33:16.869 "nvme_admin": false, 00:33:16.869 "nvme_io": false, 00:33:16.869 "nvme_io_md": false, 00:33:16.869 "write_zeroes": true, 00:33:16.869 "zcopy": false, 00:33:16.869 "get_zone_info": false, 00:33:16.869 "zone_management": false, 00:33:16.869 "zone_append": false, 00:33:16.869 "compare": false, 00:33:16.869 "compare_and_write": false, 00:33:16.869 "abort": false, 00:33:16.869 "seek_hole": false, 00:33:16.869 "seek_data": false, 00:33:16.869 "copy": false, 00:33:16.869 "nvme_iov_md": false 00:33:16.869 }, 00:33:16.869 "memory_domains": [ 00:33:16.869 { 00:33:16.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:16.869 "dma_device_type": 2 00:33:16.869 } 00:33:16.869 ], 00:33:16.869 "driver_specific": { 00:33:16.869 "crypto": { 00:33:16.869 "base_bdev_name": "EE_base0", 00:33:16.869 "name": "crypt0", 00:33:16.869 "key_name": "test_dek_sw" 00:33:16.869 } 00:33:16.869 } 00:33:16.869 } 00:33:16.869 ] 00:33:16.869 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:16.869 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:33:16.869 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=281660 00:33:16.869 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:33:16.869 09:37:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:16.869 Running I/O for 5 seconds... 
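Annotation (not part of the captured output): the ENOMEM test drives a crypt0 vbdev stacked on an error-injection bdev (EE_base0 on base0) inside a bdevperf started with -z, so everything is configured over RPC before and during the run. Roughly the same steps can be issued by hand with rpc.py against the default /var/tmp/spdk.sock socket (arguments copied verbatim from the traced rpc_cmd calls; paths relative to the SPDK tree):

  # inspect the crypto bdev sitting on top of the error-injection bdev, as done above
  ./scripts/rpc.py bdev_get_bdevs -b crypt0 -t 2000
  # start the queued I/O, as examples/bdev/bdevperf/bdevperf.py perform_tests does above
  ./examples/bdev/bdevperf/bdevperf.py perform_tests
  # inject "nomem" failures on writes to the base bdev while the run is in flight
  # (this is the step traced just below)
  ./scripts/rpc.py bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem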
00:33:17.803 09:37:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:33:17.803 09:37:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:17.803 09:37:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:17.803 09:37:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:17.803 09:37:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 281660 00:33:21.986 00:33:21.986 Latency(us) 00:33:21.986 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:21.986 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:33:21.986 crypt0 : 5.00 36214.18 141.46 0.00 0.00 879.87 411.38 1560.04 00:33:21.986 =================================================================================================================== 00:33:21.986 Total : 36214.18 141.46 0.00 0.00 879.87 411.38 1560.04 00:33:21.986 0 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 281628 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 281628 ']' 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 281628 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 281628 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 281628' 00:33:21.986 killing process with pid 281628 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 281628 00:33:21.986 Received shutdown signal, test time was about 5.000000 seconds 00:33:21.986 00:33:21.986 Latency(us) 00:33:21.986 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:21.986 =================================================================================================================== 00:33:21.986 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:21.986 09:37:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 281628 00:33:22.244 09:37:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:33:22.244 00:33:22.244 real 0m6.474s 00:33:22.244 user 0m6.762s 00:33:22.244 sys 0m0.351s 00:33:22.244 09:37:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 
-- # xtrace_disable 00:33:22.244 09:37:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:22.244 ************************************ 00:33:22.244 END TEST bdev_crypto_enomem 00:33:22.244 ************************************ 00:33:22.244 09:37:31 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:22.244 09:37:31 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:33:22.244 09:37:31 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:33:22.244 09:37:31 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:22.244 09:37:31 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:22.244 09:37:31 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:33:22.244 09:37:31 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:33:22.244 09:37:31 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:33:22.244 09:37:31 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:33:22.244 00:33:22.244 real 0m54.555s 00:33:22.244 user 1m33.526s 00:33:22.244 sys 0m6.442s 00:33:22.244 09:37:31 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:22.244 09:37:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:22.244 ************************************ 00:33:22.244 END TEST blockdev_crypto_sw 00:33:22.244 ************************************ 00:33:22.244 09:37:31 -- common/autotest_common.sh@1142 -- # return 0 00:33:22.244 09:37:31 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:33:22.244 09:37:31 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:22.244 09:37:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:22.244 09:37:31 -- common/autotest_common.sh@10 -- # set +x 00:33:22.244 ************************************ 00:33:22.244 START TEST blockdev_crypto_qat 00:33:22.244 ************************************ 00:33:22.244 09:37:31 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:33:22.502 * Looking for test storage... 
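Annotation (not part of the captured output): this log covers two passes of the same driver script, test/bdev/blockdev.sh, first with the software crypto bdev (finished above) and now with the QAT-backed one. Outside the CI wrapper the two suites can be started directly, using the paths from this node:

  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat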
00:33:22.502 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:22.502 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=282468 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 282468 00:33:22.503 09:37:31 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:22.503 09:37:31 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 282468 ']' 00:33:22.503 09:37:31 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:22.503 09:37:31 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:22.503 09:37:31 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
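Because crypto_* test types append --wait-for-rpc (blockdev.sh@691 above), spdk_tgt comes up with subsystem initialization paused so the dpdk_cryptodev/QAT accel configuration can be applied first. A minimal sketch of that start-and-wait pattern follows, assuming the default /var/tmp/spdk.sock socket used in this run; the polling loop is an illustrative stand-in for the waitforlisten helper, not its implementation.

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

    # Start the target with subsystem init deferred, as blockdev.sh@46 does above.
    "$SPDK/build/bin/spdk_tgt" --wait-for-rpc &
    tgt_pid=$!

    # Poll until the RPC socket answers (stand-in for waitforlisten).
    until "$SPDK/scripts/rpc.py" rpc_get_methods >/dev/null 2>&1; do sleep 0.2; done

    # ... setup_crypto_qat_conf pushes the accel/crypto configuration here ...

    # Resume subsystem initialization once the crypto config is in place.
    "$SPDK/scripts/rpc.py" framework_start_init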
00:33:22.503 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:22.503 09:37:31 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:22.503 09:37:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:22.503 [2024-07-15 09:37:31.331919] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:33:22.503 [2024-07-15 09:37:31.332000] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid282468 ] 00:33:22.761 [2024-07-15 09:37:31.462088] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:22.761 [2024-07-15 09:37:31.559189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:23.326 09:37:32 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:23.326 09:37:32 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:33:23.326 09:37:32 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:33:23.326 09:37:32 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:33:23.326 09:37:32 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:33:23.326 09:37:32 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:23.326 09:37:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:23.326 [2024-07-15 09:37:32.269429] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:23.326 [2024-07-15 09:37:32.277464] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:23.584 [2024-07-15 09:37:32.285481] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:23.584 [2024-07-15 09:37:32.365077] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:26.111 true 00:33:26.111 true 00:33:26.111 true 00:33:26.111 true 00:33:26.111 Malloc0 00:33:26.111 Malloc1 00:33:26.111 Malloc2 00:33:26.111 Malloc3 00:33:26.111 [2024-07-15 09:37:34.732958] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:26.111 crypto_ram 00:33:26.111 [2024-07-15 09:37:34.740975] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:26.111 crypto_ram1 00:33:26.111 [2024-07-15 09:37:34.748995] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:26.111 crypto_ram2 00:33:26.111 [2024-07-15 09:37:34.757011] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:26.111 crypto_ram3 00:33:26.111 [ 00:33:26.111 { 00:33:26.111 "name": "Malloc1", 00:33:26.111 "aliases": [ 00:33:26.111 "71ec3478-fe2a-418f-935a-af65390cc582" 00:33:26.111 ], 00:33:26.111 "product_name": "Malloc disk", 00:33:26.111 "block_size": 512, 00:33:26.111 "num_blocks": 65536, 00:33:26.111 "uuid": "71ec3478-fe2a-418f-935a-af65390cc582", 00:33:26.111 "assigned_rate_limits": { 00:33:26.111 "rw_ios_per_sec": 0, 00:33:26.111 "rw_mbytes_per_sec": 0, 00:33:26.111 "r_mbytes_per_sec": 0, 00:33:26.111 "w_mbytes_per_sec": 0 00:33:26.111 }, 00:33:26.111 "claimed": true, 00:33:26.111 "claim_type": "exclusive_write", 00:33:26.111 "zoned": false, 00:33:26.111 "supported_io_types": { 
00:33:26.111 "read": true, 00:33:26.111 "write": true, 00:33:26.111 "unmap": true, 00:33:26.111 "flush": true, 00:33:26.111 "reset": true, 00:33:26.111 "nvme_admin": false, 00:33:26.111 "nvme_io": false, 00:33:26.111 "nvme_io_md": false, 00:33:26.111 "write_zeroes": true, 00:33:26.111 "zcopy": true, 00:33:26.111 "get_zone_info": false, 00:33:26.111 "zone_management": false, 00:33:26.111 "zone_append": false, 00:33:26.111 "compare": false, 00:33:26.111 "compare_and_write": false, 00:33:26.111 "abort": true, 00:33:26.111 "seek_hole": false, 00:33:26.111 "seek_data": false, 00:33:26.111 "copy": true, 00:33:26.111 "nvme_iov_md": false 00:33:26.111 }, 00:33:26.111 "memory_domains": [ 00:33:26.111 { 00:33:26.111 "dma_device_id": "system", 00:33:26.111 "dma_device_type": 1 00:33:26.112 }, 00:33:26.112 { 00:33:26.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:26.112 "dma_device_type": 2 00:33:26.112 } 00:33:26.112 ], 00:33:26.112 "driver_specific": {} 00:33:26.112 } 00:33:26.112 ] 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' 
"aliases": [' ' "9ff794b6-dc14-5f58-a673-2513c0d5bb45"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9ff794b6-dc14-5f58-a673-2513c0d5bb45",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "b96367a7-8983-5691-9f46-c2bebffbec0a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b96367a7-8983-5691-9f46-c2bebffbec0a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4a6ae263-562e-57df-bcab-608808950bd3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4a6ae263-562e-57df-bcab-608808950bd3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f5dbcefd-6184-533f-9702-f9f0939bf6bb"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f5dbcefd-6184-533f-9702-f9f0939bf6bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:33:26.112 09:37:34 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 282468 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 282468 ']' 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 282468 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:26.112 09:37:34 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 282468 00:33:26.112 09:37:35 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:26.112 09:37:35 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:26.112 09:37:35 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 282468' 00:33:26.112 killing process with pid 282468 00:33:26.112 09:37:35 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 282468 00:33:26.112 09:37:35 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 282468 00:33:26.678 09:37:35 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:26.678 09:37:35 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:26.678 09:37:35 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:26.678 09:37:35 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:26.678 09:37:35 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:33:26.678 ************************************ 00:33:26.678 START TEST bdev_hello_world 00:33:26.678 ************************************ 00:33:26.678 09:37:35 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:26.936 [2024-07-15 09:37:35.721229] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:33:26.936 [2024-07-15 09:37:35.721357] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid283104 ] 00:33:27.194 [2024-07-15 09:37:35.919081] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:27.194 [2024-07-15 09:37:36.023827] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:27.194 [2024-07-15 09:37:36.045133] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:27.194 [2024-07-15 09:37:36.053159] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:27.194 [2024-07-15 09:37:36.061186] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:27.452 [2024-07-15 09:37:36.178437] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:29.981 [2024-07-15 09:37:38.395894] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:29.981 [2024-07-15 09:37:38.395975] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:29.981 [2024-07-15 09:37:38.395990] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:29.981 [2024-07-15 09:37:38.403913] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:29.981 [2024-07-15 09:37:38.403936] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:29.981 [2024-07-15 09:37:38.403949] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:29.981 [2024-07-15 09:37:38.411937] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:29.981 [2024-07-15 09:37:38.411955] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:29.981 [2024-07-15 09:37:38.411966] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:29.981 [2024-07-15 09:37:38.419958] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:29.981 [2024-07-15 09:37:38.419975] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:29.981 [2024-07-15 09:37:38.419986] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:29.981 [2024-07-15 09:37:38.497726] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:29.981 [2024-07-15 09:37:38.497772] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:29.981 [2024-07-15 09:37:38.497791] hello_bdev.c: 244:hello_start: 
*NOTICE*: Opening io channel 00:33:29.981 [2024-07-15 09:37:38.499065] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:29.981 [2024-07-15 09:37:38.499135] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:29.981 [2024-07-15 09:37:38.499158] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:29.981 [2024-07-15 09:37:38.499201] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:33:29.981 00:33:29.981 [2024-07-15 09:37:38.499220] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:29.981 00:33:29.981 real 0m3.263s 00:33:29.981 user 0m2.781s 00:33:29.981 sys 0m0.437s 00:33:29.981 09:37:38 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:29.981 09:37:38 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:29.981 ************************************ 00:33:29.981 END TEST bdev_hello_world 00:33:29.981 ************************************ 00:33:29.981 09:37:38 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:29.981 09:37:38 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:29.981 09:37:38 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:29.981 09:37:38 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:29.981 09:37:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:30.239 ************************************ 00:33:30.239 START TEST bdev_bounds 00:33:30.239 ************************************ 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=283484 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 283484' 00:33:30.239 Process bdevio pid: 283484 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 283484 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 283484 ']' 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:30.239 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:30.239 09:37:38 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:30.239 [2024-07-15 09:37:39.020977] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:33:30.239 [2024-07-15 09:37:39.021040] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid283484 ] 00:33:30.239 [2024-07-15 09:37:39.148167] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:30.497 [2024-07-15 09:37:39.254051] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:30.497 [2024-07-15 09:37:39.254134] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:30.497 [2024-07-15 09:37:39.254140] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:30.497 [2024-07-15 09:37:39.275580] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:30.497 [2024-07-15 09:37:39.283606] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:30.497 [2024-07-15 09:37:39.291641] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:30.497 [2024-07-15 09:37:39.394169] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:33.075 [2024-07-15 09:37:41.595933] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:33.075 [2024-07-15 09:37:41.596014] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:33.075 [2024-07-15 09:37:41.596035] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:33.075 [2024-07-15 09:37:41.603954] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:33.075 [2024-07-15 09:37:41.603974] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:33.075 [2024-07-15 09:37:41.603986] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:33.075 [2024-07-15 09:37:41.611975] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:33.075 [2024-07-15 09:37:41.611993] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:33.075 [2024-07-15 09:37:41.612005] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:33.075 [2024-07-15 09:37:41.619999] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:33.075 [2024-07-15 09:37:41.620015] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:33.075 [2024-07-15 09:37:41.620027] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:33.075 09:37:41 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:33.075 09:37:41 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:33.075 09:37:41 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:33.075 I/O targets: 00:33:33.075 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:33:33.075 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:33:33.075 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:33:33.075 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:33:33.075 
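The bounds test drives the four crypto_qat bdevs listed under "I/O targets" below through bdevio. A condensed sketch of the two invocations the log shows (bdevio started with -w so it waits for an RPC trigger, then the companion tests.py launching the CUnit suites); the backgrounding and final kill are illustrative glue standing in for the script's trap/killprocess handling.

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

    # Launch bdevio against the shared bdev.json and leave it waiting for the RPC
    # trigger (mirrors blockdev.sh@289 above).
    "$SPDK/test/bdev/bdevio/bdevio" -w -s 0 --json "$SPDK/test/bdev/bdev.json" &
    bdevio_pid=$!

    # Run the suites over every registered bdev (blockdev.sh@294 above); this is what
    # produces the per-test "passed" lines and the run summary that follow.
    "$SPDK/test/bdev/bdevio/tests.py" perform_tests

    kill "$bdevio_pid"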
00:33:33.075 00:33:33.075 CUnit - A unit testing framework for C - Version 2.1-3 00:33:33.075 http://cunit.sourceforge.net/ 00:33:33.075 00:33:33.075 00:33:33.075 Suite: bdevio tests on: crypto_ram3 00:33:33.075 Test: blockdev write read block ...passed 00:33:33.075 Test: blockdev write zeroes read block ...passed 00:33:33.075 Test: blockdev write zeroes read no split ...passed 00:33:33.075 Test: blockdev write zeroes read split ...passed 00:33:33.075 Test: blockdev write zeroes read split partial ...passed 00:33:33.075 Test: blockdev reset ...passed 00:33:33.075 Test: blockdev write read 8 blocks ...passed 00:33:33.075 Test: blockdev write read size > 128k ...passed 00:33:33.075 Test: blockdev write read invalid size ...passed 00:33:33.075 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:33.075 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:33.075 Test: blockdev write read max offset ...passed 00:33:33.075 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:33.075 Test: blockdev writev readv 8 blocks ...passed 00:33:33.075 Test: blockdev writev readv 30 x 1block ...passed 00:33:33.075 Test: blockdev writev readv block ...passed 00:33:33.075 Test: blockdev writev readv size > 128k ...passed 00:33:33.075 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:33.075 Test: blockdev comparev and writev ...passed 00:33:33.075 Test: blockdev nvme passthru rw ...passed 00:33:33.075 Test: blockdev nvme passthru vendor specific ...passed 00:33:33.075 Test: blockdev nvme admin passthru ...passed 00:33:33.075 Test: blockdev copy ...passed 00:33:33.075 Suite: bdevio tests on: crypto_ram2 00:33:33.075 Test: blockdev write read block ...passed 00:33:33.075 Test: blockdev write zeroes read block ...passed 00:33:33.075 Test: blockdev write zeroes read no split ...passed 00:33:33.075 Test: blockdev write zeroes read split ...passed 00:33:33.075 Test: blockdev write zeroes read split partial ...passed 00:33:33.075 Test: blockdev reset ...passed 00:33:33.075 Test: blockdev write read 8 blocks ...passed 00:33:33.075 Test: blockdev write read size > 128k ...passed 00:33:33.075 Test: blockdev write read invalid size ...passed 00:33:33.075 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:33.075 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:33.075 Test: blockdev write read max offset ...passed 00:33:33.075 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:33.075 Test: blockdev writev readv 8 blocks ...passed 00:33:33.075 Test: blockdev writev readv 30 x 1block ...passed 00:33:33.075 Test: blockdev writev readv block ...passed 00:33:33.075 Test: blockdev writev readv size > 128k ...passed 00:33:33.075 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:33.075 Test: blockdev comparev and writev ...passed 00:33:33.075 Test: blockdev nvme passthru rw ...passed 00:33:33.075 Test: blockdev nvme passthru vendor specific ...passed 00:33:33.075 Test: blockdev nvme admin passthru ...passed 00:33:33.075 Test: blockdev copy ...passed 00:33:33.075 Suite: bdevio tests on: crypto_ram1 00:33:33.075 Test: blockdev write read block ...passed 00:33:33.075 Test: blockdev write zeroes read block ...passed 00:33:33.075 Test: blockdev write zeroes read no split ...passed 00:33:33.075 Test: blockdev write zeroes read split ...passed 00:33:33.075 Test: blockdev write zeroes read split partial ...passed 00:33:33.075 Test: blockdev reset 
...passed 00:33:33.075 Test: blockdev write read 8 blocks ...passed 00:33:33.075 Test: blockdev write read size > 128k ...passed 00:33:33.075 Test: blockdev write read invalid size ...passed 00:33:33.075 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:33.075 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:33.075 Test: blockdev write read max offset ...passed 00:33:33.075 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:33.075 Test: blockdev writev readv 8 blocks ...passed 00:33:33.075 Test: blockdev writev readv 30 x 1block ...passed 00:33:33.075 Test: blockdev writev readv block ...passed 00:33:33.075 Test: blockdev writev readv size > 128k ...passed 00:33:33.075 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:33.075 Test: blockdev comparev and writev ...passed 00:33:33.075 Test: blockdev nvme passthru rw ...passed 00:33:33.075 Test: blockdev nvme passthru vendor specific ...passed 00:33:33.075 Test: blockdev nvme admin passthru ...passed 00:33:33.075 Test: blockdev copy ...passed 00:33:33.075 Suite: bdevio tests on: crypto_ram 00:33:33.075 Test: blockdev write read block ...passed 00:33:33.075 Test: blockdev write zeroes read block ...passed 00:33:33.075 Test: blockdev write zeroes read no split ...passed 00:33:33.334 Test: blockdev write zeroes read split ...passed 00:33:33.334 Test: blockdev write zeroes read split partial ...passed 00:33:33.334 Test: blockdev reset ...passed 00:33:33.334 Test: blockdev write read 8 blocks ...passed 00:33:33.334 Test: blockdev write read size > 128k ...passed 00:33:33.334 Test: blockdev write read invalid size ...passed 00:33:33.334 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:33.334 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:33.334 Test: blockdev write read max offset ...passed 00:33:33.334 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:33.334 Test: blockdev writev readv 8 blocks ...passed 00:33:33.334 Test: blockdev writev readv 30 x 1block ...passed 00:33:33.334 Test: blockdev writev readv block ...passed 00:33:33.334 Test: blockdev writev readv size > 128k ...passed 00:33:33.334 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:33.334 Test: blockdev comparev and writev ...passed 00:33:33.334 Test: blockdev nvme passthru rw ...passed 00:33:33.334 Test: blockdev nvme passthru vendor specific ...passed 00:33:33.334 Test: blockdev nvme admin passthru ...passed 00:33:33.334 Test: blockdev copy ...passed 00:33:33.334 00:33:33.334 Run Summary: Type Total Ran Passed Failed Inactive 00:33:33.334 suites 4 4 n/a 0 0 00:33:33.334 tests 92 92 92 0 0 00:33:33.334 asserts 520 520 520 0 n/a 00:33:33.334 00:33:33.334 Elapsed time = 0.511 seconds 00:33:33.334 0 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 283484 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 283484 ']' 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 283484 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 283484 00:33:33.334 09:37:42 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 283484' 00:33:33.334 killing process with pid 283484 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 283484 00:33:33.334 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 283484 00:33:33.592 09:37:42 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:33:33.592 00:33:33.592 real 0m3.582s 00:33:33.592 user 0m9.939s 00:33:33.592 sys 0m0.547s 00:33:33.592 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:33.592 09:37:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:33.592 ************************************ 00:33:33.592 END TEST bdev_bounds 00:33:33.592 ************************************ 00:33:33.850 09:37:42 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:33.851 09:37:42 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:33:33.851 09:37:42 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:33.851 09:37:42 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:33.851 09:37:42 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:33.851 ************************************ 00:33:33.851 START TEST bdev_nbd 00:33:33.851 ************************************ 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=284031 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 284031 /var/tmp/spdk-nbd.sock 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 284031 ']' 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:33.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:33.851 09:37:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:33.851 [2024-07-15 09:37:42.734472] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
00:33:33.851 [2024-07-15 09:37:42.734607] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:34.109 [2024-07-15 09:37:42.931380] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:34.109 [2024-07-15 09:37:43.032108] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:34.109 [2024-07-15 09:37:43.053405] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:34.109 [2024-07-15 09:37:43.061432] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:34.367 [2024-07-15 09:37:43.069448] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:34.367 [2024-07-15 09:37:43.181308] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:36.894 [2024-07-15 09:37:45.390915] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:36.894 [2024-07-15 09:37:45.390983] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:36.894 [2024-07-15 09:37:45.390998] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:36.894 [2024-07-15 09:37:45.398940] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:36.894 [2024-07-15 09:37:45.398959] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:36.894 [2024-07-15 09:37:45.398971] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:36.894 [2024-07-15 09:37:45.406957] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:36.894 [2024-07-15 09:37:45.406974] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:36.894 [2024-07-15 09:37:45.406985] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:36.894 [2024-07-15 09:37:45.414979] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:36.894 [2024-07-15 09:37:45.414995] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:36.894 [2024-07-15 09:37:45.415007] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram1 crypto_ram2 crypto_ram3' 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:36.894 1+0 records in 00:33:36.894 1+0 records out 00:33:36.894 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290405 s, 14.1 MB/s 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:36.894 09:37:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 
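The NBD verification repeating above and below follows one small per-device pattern: expose a crypto bdev as an NBD disk over the dedicated /var/tmp/spdk-nbd.sock, prove it is readable with a single direct-I/O dd (the real waitfornbd helper first polls /proc/partitions for the device), and later stop it again. A condensed sketch of that loop, with the bdev-to-device mapping taken from this log; the output file name is illustrative.

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

    for pair in crypto_ram:/dev/nbd0 crypto_ram1:/dev/nbd1 crypto_ram2:/dev/nbd2 crypto_ram3:/dev/nbd3; do
        bdev=${pair%%:*} nbd=${pair##*:}
        $RPC nbd_start_disk "$bdev" "$nbd"                        # expose the bdev as an NBD block device
        dd if="$nbd" of=/tmp/nbdtest bs=4096 count=1 iflag=direct # same single-block read check as waitfornbd
        $RPC nbd_stop_disk "$nbd"                                 # detach again (nbd_stop_disks further below)
    done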
00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:37.152 1+0 records in 00:33:37.152 1+0 records out 00:33:37.152 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000347322 s, 11.8 MB/s 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:37.152 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:37.410 09:37:46 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:37.410 1+0 records in 00:33:37.410 1+0 records out 00:33:37.410 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000330768 s, 12.4 MB/s 00:33:37.410 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:37.667 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:37.667 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:37.667 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:37.667 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:37.667 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:37.667 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:37.667 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:37.667 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:37.668 1+0 records in 00:33:37.668 1+0 records out 00:33:37.668 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000296464 s, 13.8 MB/s 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:37.668 09:37:46 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:37.668 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:37.925 { 00:33:37.925 "nbd_device": "/dev/nbd0", 00:33:37.925 "bdev_name": "crypto_ram" 00:33:37.925 }, 00:33:37.925 { 00:33:37.925 "nbd_device": "/dev/nbd1", 00:33:37.925 "bdev_name": "crypto_ram1" 00:33:37.925 }, 00:33:37.925 { 00:33:37.925 "nbd_device": "/dev/nbd2", 00:33:37.925 "bdev_name": "crypto_ram2" 00:33:37.925 }, 00:33:37.925 { 00:33:37.925 "nbd_device": "/dev/nbd3", 00:33:37.925 "bdev_name": "crypto_ram3" 00:33:37.925 } 00:33:37.925 ]' 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:37.925 { 00:33:37.925 "nbd_device": "/dev/nbd0", 00:33:37.925 "bdev_name": "crypto_ram" 00:33:37.925 }, 00:33:37.925 { 00:33:37.925 "nbd_device": "/dev/nbd1", 00:33:37.925 "bdev_name": "crypto_ram1" 00:33:37.925 }, 00:33:37.925 { 00:33:37.925 "nbd_device": "/dev/nbd2", 00:33:37.925 "bdev_name": "crypto_ram2" 00:33:37.925 }, 00:33:37.925 { 00:33:37.925 "nbd_device": "/dev/nbd3", 00:33:37.925 "bdev_name": "crypto_ram3" 00:33:37.925 } 00:33:37.925 ]' 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:37.925 09:37:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:38.183 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:38.439 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:38.696 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:38.983 09:37:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:39.240 09:37:48 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:39.240 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:39.240 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:39.497 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:39.497 /dev/nbd0 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:39.754 09:37:48 
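[Editor's note] At this point the test has torn down all four NBD exports and nbd_get_count has confirmed that nbd_get_disks returns an empty list (count 0); it then re-exports the same bdevs on /dev/nbd0, /dev/nbd1, /dev/nbd10 and /dev/nbd11 for the data-verification pass. A hedged sketch of the counting idiom visible in the trace, reusing the rpc.py path and RPC socket from this run:

    # Sketch: count how many NBD devices the SPDK app currently exports.
    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-nbd.sock
    json=$("$RPC" -s "$SOCK" nbd_get_disks)
    # grep -c exits non-zero when there are no matches, hence the `|| true` guard
    # (the trace shows exactly that branch firing when the list is empty).
    count=$(echo "$json" | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
    echo "active NBD devices: $count"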
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:39.754 1+0 records in 00:33:39.754 1+0 records out 00:33:39.754 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000389666 s, 10.5 MB/s 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:39.754 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:33:40.011 /dev/nbd1 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:40.011 1+0 records in 00:33:40.011 1+0 records out 00:33:40.011 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247594 s, 16.5 MB/s 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:40.011 09:37:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:33:40.268 /dev/nbd10 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:40.268 1+0 records in 00:33:40.268 1+0 records out 00:33:40.268 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000329776 s, 12.4 MB/s 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:40.268 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:33:40.525 /dev/nbd11 00:33:40.525 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:33:40.525 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd11 00:33:40.525 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:40.526 1+0 records in 00:33:40.526 1+0 records out 00:33:40.526 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000375105 s, 10.9 MB/s 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:40.526 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:40.784 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:40.784 { 00:33:40.784 "nbd_device": "/dev/nbd0", 00:33:40.784 "bdev_name": "crypto_ram" 00:33:40.784 }, 00:33:40.784 { 00:33:40.784 "nbd_device": "/dev/nbd1", 00:33:40.784 "bdev_name": "crypto_ram1" 00:33:40.784 }, 00:33:40.784 { 00:33:40.784 "nbd_device": "/dev/nbd10", 00:33:40.784 "bdev_name": "crypto_ram2" 00:33:40.784 }, 00:33:40.784 { 00:33:40.784 "nbd_device": "/dev/nbd11", 00:33:40.784 "bdev_name": "crypto_ram3" 00:33:40.784 } 00:33:40.784 ]' 00:33:40.784 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:40.784 { 00:33:40.784 "nbd_device": "/dev/nbd0", 00:33:40.784 "bdev_name": "crypto_ram" 00:33:40.784 }, 00:33:40.784 { 00:33:40.784 "nbd_device": "/dev/nbd1", 00:33:40.784 "bdev_name": "crypto_ram1" 00:33:40.784 }, 00:33:40.784 { 00:33:40.784 "nbd_device": "/dev/nbd10", 00:33:40.784 "bdev_name": "crypto_ram2" 00:33:40.784 }, 00:33:40.784 { 00:33:40.784 "nbd_device": "/dev/nbd11", 00:33:40.785 "bdev_name": "crypto_ram3" 00:33:40.785 } 00:33:40.785 ]' 00:33:40.785 
09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:40.785 /dev/nbd1 00:33:40.785 /dev/nbd10 00:33:40.785 /dev/nbd11' 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:40.785 /dev/nbd1 00:33:40.785 /dev/nbd10 00:33:40.785 /dev/nbd11' 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:40.785 256+0 records in 00:33:40.785 256+0 records out 00:33:40.785 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107605 s, 97.4 MB/s 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:40.785 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:41.044 256+0 records in 00:33:41.044 256+0 records out 00:33:41.044 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0837947 s, 12.5 MB/s 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:41.044 256+0 records in 00:33:41.044 256+0 records out 00:33:41.044 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0619675 s, 16.9 MB/s 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:33:41.044 256+0 records in 00:33:41.044 256+0 records out 00:33:41.044 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0464253 s, 22.6 MB/s 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:33:41.044 256+0 records in 00:33:41.044 256+0 records out 00:33:41.044 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0536847 s, 19.5 MB/s 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:41.044 09:37:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:41.301 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:41.302 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:41.302 09:37:50 
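[Editor's note] The nbd_dd_data_verify pass above writes the same 1 MiB of random data to all four NBD devices and then reads each one back with cmp, which exercises the crypto bdevs end to end: encrypt on write, decrypt on read, identical plaintext expected. A sketch of that write-then-verify loop, with the temp-file path shortened for readability:

    # Sketch: write identical random data to several NBD devices, then verify.
    nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11)
    tmp=/tmp/nbdrandtest
    dd if=/dev/urandom of="$tmp" bs=4096 count=256          # 1 MiB of test data
    for dev in "${nbd_list[@]}"; do
        dd if="$tmp" of="$dev" bs=4096 count=256 oflag=direct
    done
    for dev in "${nbd_list[@]}"; do
        # -b prints differing bytes; -n 1M limits the compare to what was written.
        cmp -b -n 1M "$tmp" "$dev"
    done
    rm "$tmp"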
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:41.302 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:41.302 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:41.302 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:41.302 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:41.302 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:41.302 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:41.302 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:41.559 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:33:41.816 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:33:41.816 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:33:41.816 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:33:41.816 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:41.816 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:41.816 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:33:41.816 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:33:42.074 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:33:42.074 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:42.074 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:33:42.074 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:42.074 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:42.074 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:42.074 09:37:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:33:42.333 09:37:51 
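[Editor's note] Note the nbd10 teardown above: the first /proc/partitions check still saw the device, so the helper slept 100 ms and re-checked before declaring the detach complete; this is the only place in this run where the retry branch fired. A sketch of that teardown wait, mirroring the loop bounds shown in the trace (again illustrative, not the exact helper):

    # Sketch: wait (up to ~2 s) for an NBD device to disappear after nbd_stop_disk.
    waitfornbd_exit_sketch() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            # Done as soon as the kernel no longer lists the device.
            grep -q -w "$nbd_name" /proc/partitions || return 0
            sleep 0.1
        done
        return 1    # still present after 20 polls
    }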
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:42.333 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:42.591 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:42.591 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:42.591 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:42.592 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:42.849 malloc_lvol_verify 00:33:42.849 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:43.107 4e90f0a9-b0f2-49a8-9142-c6f01af7211a 00:33:43.107 09:37:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 
-l lvs 00:33:43.364 5a8bb6fb-6f49-42a6-88f8-691d97fe88d5 00:33:43.364 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:43.621 /dev/nbd0 00:33:43.621 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:43.621 mke2fs 1.46.5 (30-Dec-2021) 00:33:43.621 Discarding device blocks: 0/4096 done 00:33:43.621 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:43.621 00:33:43.621 Allocating group tables: 0/1 done 00:33:43.621 Writing inode tables: 0/1 done 00:33:43.621 Creating journal (1024 blocks): done 00:33:43.621 Writing superblocks and filesystem accounting information: 0/1 done 00:33:43.621 00:33:43.621 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:43.621 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:43.621 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:43.621 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:43.621 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:43.621 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:43.621 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:43.621 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 284031 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 284031 ']' 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 284031 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 284031 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 284031' 00:33:43.909 killing process with pid 284031 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 284031 00:33:43.909 09:37:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 284031 00:33:44.168 09:37:53 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:44.168 00:33:44.168 real 0m10.460s 00:33:44.168 user 0m13.472s 00:33:44.168 sys 0m4.211s 00:33:44.168 09:37:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:44.168 09:37:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:44.168 ************************************ 00:33:44.168 END TEST bdev_nbd 00:33:44.168 ************************************ 00:33:44.428 09:37:53 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:44.428 09:37:53 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:44.428 09:37:53 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:33:44.428 09:37:53 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:33:44.428 09:37:53 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:44.428 09:37:53 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:44.428 09:37:53 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:44.428 09:37:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:44.428 ************************************ 00:33:44.428 START TEST bdev_fio 00:33:44.428 ************************************ 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:44.428 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:44.428 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@10 -- # set +x 00:33:44.429 ************************************ 00:33:44.429 START TEST bdev_fio_rw_verify 00:33:44.429 ************************************ 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:44.429 09:37:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:44.687 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:44.687 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:44.687 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:44.687 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:44.687 fio-3.35 00:33:44.687 Starting 4 threads 00:33:59.623 00:33:59.624 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=286056: Mon Jul 15 09:38:06 2024 00:33:59.624 read: IOPS=20.0k, BW=78.1MiB/s (81.9MB/s)(781MiB/10001msec) 00:33:59.624 slat (usec): min=11, max=406, avg=69.86, stdev=48.30 00:33:59.624 clat (usec): min=23, max=2173, avg=375.16, stdev=294.26 00:33:59.624 lat (usec): min=50, max=2361, avg=445.01, stdev=328.43 00:33:59.624 clat percentiles (usec): 00:33:59.624 | 50.000th=[ 285], 99.000th=[ 1516], 99.900th=[ 1909], 99.990th=[ 2073], 00:33:59.624 | 99.999th=[ 2147] 00:33:59.624 write: IOPS=21.9k, BW=85.7MiB/s (89.9MB/s)(836MiB/9760msec); 0 zone resets 00:33:59.624 slat (usec): min=18, max=1221, avg=81.80, stdev=52.21 00:33:59.624 clat (usec): min=20, max=2318, avg=417.26, stdev=321.29 00:33:59.624 lat (usec): min=47, max=2597, avg=499.06, stdev=358.71 00:33:59.624 clat percentiles (usec): 00:33:59.624 | 50.000th=[ 326], 99.000th=[ 1713], 99.900th=[ 2114], 99.990th=[ 2245], 00:33:59.624 | 99.999th=[ 2311] 00:33:59.624 bw ( KiB/s): min=67480, max=106536, per=97.65%, avg=85690.68, stdev=2572.84, samples=76 00:33:59.624 iops : min=16870, max=26634, avg=21422.63, stdev=643.22, samples=76 00:33:59.624 lat (usec) : 50=0.02%, 100=4.65%, 250=33.54%, 500=37.77%, 750=13.41% 00:33:59.624 lat (usec) : 1000=5.47% 00:33:59.624 lat (msec) : 2=5.01%, 4=0.12% 00:33:59.624 cpu : usr=99.53%, sys=0.00%, ctx=44, majf=0, minf=241 00:33:59.624 IO depths : 1=4.8%, 2=27.2%, 4=54.4%, 8=13.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:59.624 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:59.624 complete : 0=0.0%, 4=88.0%, 8=12.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:59.624 issued rwts: total=200032,214115,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:59.624 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:59.624 00:33:59.624 Run status group 0 (all jobs): 00:33:59.624 READ: bw=78.1MiB/s (81.9MB/s), 78.1MiB/s-78.1MiB/s (81.9MB/s-81.9MB/s), io=781MiB (819MB), run=10001-10001msec 00:33:59.624 WRITE: bw=85.7MiB/s (89.9MB/s), 85.7MiB/s-85.7MiB/s (89.9MB/s-89.9MB/s), io=836MiB (877MB), run=9760-9760msec 00:33:59.624 00:33:59.624 real 0m13.512s 00:33:59.624 user 0m45.665s 00:33:59.624 sys 0m0.482s 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:59.624 ************************************ 00:33:59.624 END TEST bdev_fio_rw_verify 00:33:59.624 ************************************ 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "9ff794b6-dc14-5f58-a673-2513c0d5bb45"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9ff794b6-dc14-5f58-a673-2513c0d5bb45",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": 
"system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "b96367a7-8983-5691-9f46-c2bebffbec0a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b96367a7-8983-5691-9f46-c2bebffbec0a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4a6ae263-562e-57df-bcab-608808950bd3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4a6ae263-562e-57df-bcab-608808950bd3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f5dbcefd-6184-533f-9702-f9f0939bf6bb"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f5dbcefd-6184-533f-9702-f9f0939bf6bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:59.624 crypto_ram1 00:33:59.624 crypto_ram2 00:33:59.624 crypto_ram3 ]] 00:33:59.624 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "9ff794b6-dc14-5f58-a673-2513c0d5bb45"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9ff794b6-dc14-5f58-a673-2513c0d5bb45",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "b96367a7-8983-5691-9f46-c2bebffbec0a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b96367a7-8983-5691-9f46-c2bebffbec0a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' 
}' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4a6ae263-562e-57df-bcab-608808950bd3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4a6ae263-562e-57df-bcab-608808950bd3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f5dbcefd-6184-533f-9702-f9f0939bf6bb"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f5dbcefd-6184-533f-9702-f9f0939bf6bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:59.625 09:38:06 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:59.625 ************************************ 00:33:59.625 START TEST bdev_fio_trim 00:33:59.625 ************************************ 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:59.625 09:38:06 
blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:33:59.625 09:38:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib=
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}"
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib=
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev'
00:33:59.625 09:38:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:33:59.625 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:33:59.625 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:33:59.625 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:33:59.625 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:33:59.625 fio-3.35
00:33:59.625 Starting 4 threads
00:34:11.820 
00:34:11.820 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=287903: Mon Jul 15 09:38:20 2024
00:34:11.820 write: IOPS=30.5k, BW=119MiB/s (125MB/s)(1192MiB/10001msec); 0 zone resets
00:34:11.820 slat (usec): min=17, max=1380, avg=77.80, stdev=45.11
00:34:11.820 clat (usec): min=28, max=1215, avg=274.32, stdev=164.28
00:34:11.820 lat (usec): min=47, max=1724, avg=352.12, stdev=191.78
00:34:11.820 clat percentiles (usec):
00:34:11.820 | 50.000th=[ 243], 99.000th=[ 857], 99.900th=[ 1020], 99.990th=[ 1090],
00:34:11.820 | 99.999th=[ 1139]
00:34:11.820 bw ( KiB/s): min=106008, max=160544, per=100.00%, avg=122549.47, stdev=4016.77, samples=76
00:34:11.820 iops : min=26498, max=40136, avg=30637.47, stdev=1004.17, samples=76
00:34:11.820 trim: IOPS=30.5k, BW=119MiB/s (125MB/s)(1192MiB/10001msec); 0 zone resets
00:34:11.820 slat (nsec): min=5456, max=94148, avg=20688.74, stdev=7866.93
00:34:11.820 clat (usec): min=47, max=1725, avg=352.32, stdev=191.84
00:34:11.820 lat (usec): min=54, max=1743, avg=373.01, stdev=194.68
00:34:11.820 clat percentiles (usec):
00:34:11.820 | 50.000th=[ 314], 99.000th=[ 1029], 99.900th=[ 1221], 99.990th=[ 1303],
00:34:11.820 | 99.999th=[ 1418]
00:34:11.820 bw ( KiB/s): min=105984, max=160544, per=100.00%, avg=122549.89, stdev=4016.67, samples=76
00:34:11.820 iops : min=26496, max=40136, avg=30637.47, stdev=1004.17, samples=76
00:34:11.820 lat (usec) : 50=0.04%, 100=5.09%, 250=37.49%, 500=45.66%, 750=7.73%
00:34:11.820 lat (usec) : 1000=3.24%
00:34:11.820 lat (msec) : 2=0.75%
00:34:11.820 cpu : usr=99.53%, sys=0.00%, ctx=64, majf=0, minf=113
00:34:11.820 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0%
00:34:11.820 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:34:11.820 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:34:11.820 issued rwts: total=0,305048,305048,0 short=0,0,0,0 dropped=0,0,0,0
00:34:11.820 latency : target=0, window=0, percentile=100.00%, depth=8
00:34:11.820 
00:34:11.820 Run status group 0 (all jobs):
00:34:11.820 WRITE: bw=119MiB/s (125MB/s), 119MiB/s-119MiB/s (125MB/s-125MB/s), io=1192MiB (1249MB), run=10001-10001msec
00:34:11.820 TRIM: bw=119MiB/s (125MB/s), 119MiB/s-119MiB/s (125MB/s-125MB/s), io=1192MiB (1249MB), run=10001-10001msec
00:34:11.820 
00:34:11.820 real 0m13.553s
00:34:11.820 user 0m45.445s
00:34:11.820 sys 0m0.551s
00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:34:11.820 ************************************
00:34:11.820 END TEST bdev_fio_trim
00:34:11.820 ************************************
00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0
00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:34:11.820 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:34:11.820 
00:34:11.820 real 0m27.414s
00:34:11.820 user 1m31.300s
00:34:11.820 sys 0m1.211s
00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:34:11.820 ************************************
00:34:11.820 END TEST bdev_fio
00:34:11.820 ************************************
00:34:11.820 09:38:20 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:34:11.820 09:38:20 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:34:11.820 09:38:20 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:34:11.820 09:38:20 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:34:11.820 09:38:20 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:11.820 09:38:20 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:11.820 ************************************
00:34:11.820 START TEST bdev_verify
************************************ 00:34:11.820 09:38:20 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:11.820 [2024-07-15 09:38:20.714759] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:34:11.820 [2024-07-15 09:38:20.714820] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid289180 ] 00:34:12.078 [2024-07-15 09:38:20.842492] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:12.079 [2024-07-15 09:38:20.945098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:12.079 [2024-07-15 09:38:20.945105] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:12.079 [2024-07-15 09:38:20.966448] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:12.079 [2024-07-15 09:38:20.974479] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:12.079 [2024-07-15 09:38:20.982508] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:12.337 [2024-07-15 09:38:21.089116] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:14.866 [2024-07-15 09:38:23.290965] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:14.866 [2024-07-15 09:38:23.291058] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:14.866 [2024-07-15 09:38:23.291073] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.866 [2024-07-15 09:38:23.298980] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:14.866 [2024-07-15 09:38:23.299000] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:14.866 [2024-07-15 09:38:23.299012] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.866 [2024-07-15 09:38:23.307004] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:14.866 [2024-07-15 09:38:23.307023] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:14.866 [2024-07-15 09:38:23.307035] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.866 [2024-07-15 09:38:23.315029] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:14.866 [2024-07-15 09:38:23.315047] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:14.866 [2024-07-15 09:38:23.315058] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.866 Running I/O for 5 seconds... 
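For anyone reproducing this stage outside Jenkins: the bdev_verify step above reduces to a single bdevperf invocation, recorded in the run_test line earlier in this log. A minimal sketch for re-running it by hand follows, assuming the same workspace layout as this job (SPDK_DIR is an assumption; point it at your own SPDK checkout, and note that, as in the CI run, bdevperf typically needs hugepages configured and root privileges):

    # Sketch only: re-run the bdev_verify step by hand.
    # Flags are copied from the logged command: queue depth 128, 4 KiB I/Os,
    # verify workload for 5 seconds, core mask 0x3 (the two reactors shown above).
    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK_DIR/build/examples/bdevperf" \
        --json "$SPDK_DIR/test/bdev/bdev.json" \
        -q 128 -o 4096 -w verify -t 5 -C -m 0x3

In the results table that follows, each crypto_ram* bdev is reported twice, once per core in the 0x3 mask (Core Mask 0x1 and 0x2).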
00:34:20.128 
00:34:20.128 Latency(us)
00:34:20.128 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:20.128 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:20.128 Verification LBA range: start 0x0 length 0x1000
00:34:20.128 crypto_ram : 5.07 481.50 1.88 0.00 0.00 264573.94 790.71 191479.10
00:34:20.128 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:20.128 Verification LBA range: start 0x1000 length 0x1000
00:34:20.128 crypto_ram : 5.07 487.32 1.90 0.00 0.00 261280.30 2194.03 191479.10
00:34:20.128 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:20.128 Verification LBA range: start 0x0 length 0x1000
00:34:20.128 crypto_ram1 : 5.07 485.86 1.90 0.00 0.00 261756.46 911.81 178713.82
00:34:20.128 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:20.128 Verification LBA range: start 0x1000 length 0x1000
00:34:20.128 crypto_ram1 : 5.07 490.30 1.92 0.00 0.00 259248.81 2364.99 178713.82
00:34:20.128 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:20.128 Verification LBA range: start 0x0 length 0x1000
00:34:20.128 crypto_ram2 : 5.06 3797.11 14.83 0.00 0.00 33401.23 5613.30 28607.89
00:34:20.128 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:20.128 Verification LBA range: start 0x1000 length 0x1000
00:34:20.128 crypto_ram2 : 5.06 3822.20 14.93 0.00 0.00 33183.50 5385.35 29063.79
00:34:20.128 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:20.128 Verification LBA range: start 0x0 length 0x1000
00:34:20.128 crypto_ram3 : 5.06 3794.43 14.82 0.00 0.00 33309.51 6781.55 29177.77
00:34:20.128 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:20.128 Verification LBA range: start 0x1000 length 0x1000
00:34:20.128 crypto_ram3 : 5.06 3819.54 14.92 0.00 0.00 33091.13 6610.59 28493.91
00:34:20.128 ===================================================================================================================
00:34:20.128 Total : 17178.26 67.10 0.00 0.00 59171.96 790.71 191479.10
00:34:20.128 
00:34:20.128 real 0m8.224s
00:34:20.128 user 0m15.601s
00:34:20.128 sys 0m0.365s
00:34:20.128 09:38:28 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:20.128 09:38:28 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:34:20.128 ************************************
00:34:20.128 END TEST bdev_verify
00:34:20.128 ************************************
00:34:20.128 09:38:28 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:34:20.128 09:38:28 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:34:20.128 09:38:28 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:34:20.128 09:38:28 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:20.128 09:38:28 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:20.128 ************************************
00:34:20.128 START TEST bdev_verify_big_io
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:20.128 [2024-07-15 09:38:29.013644] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:34:20.128 [2024-07-15 09:38:29.013708] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid290235 ] 00:34:20.386 [2024-07-15 09:38:29.144564] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:20.386 [2024-07-15 09:38:29.246766] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:20.386 [2024-07-15 09:38:29.246772] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:20.386 [2024-07-15 09:38:29.268168] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:20.386 [2024-07-15 09:38:29.276197] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:20.386 [2024-07-15 09:38:29.284225] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:20.644 [2024-07-15 09:38:29.394098] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:23.171 [2024-07-15 09:38:31.597626] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:23.171 [2024-07-15 09:38:31.597716] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:23.171 [2024-07-15 09:38:31.597731] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:23.171 [2024-07-15 09:38:31.605645] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:23.171 [2024-07-15 09:38:31.605666] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:23.171 [2024-07-15 09:38:31.605679] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:23.171 [2024-07-15 09:38:31.613668] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:23.171 [2024-07-15 09:38:31.613687] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:23.171 [2024-07-15 09:38:31.613699] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:23.171 [2024-07-15 09:38:31.621689] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:23.171 [2024-07-15 09:38:31.621709] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:23.171 [2024-07-15 09:38:31.621720] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:23.171 Running I/O for 5 seconds... 00:34:23.741 [2024-07-15 09:38:32.551359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.552816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
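The bdev_verify_big_io stage that begins here is the same invocation as the sketch after the bdev_verify stage above, with only the I/O size changed per the logged run_test line:

    # Sketch only: the big-I/O pass differs just in -o (64 KiB instead of 4 KiB).
    "$SPDK_DIR/build/examples/bdevperf" --json "$SPDK_DIR/test/bdev/bdev.json" -q 128 -o 65536 -w verify -t 5 -C -m 0x3

One reading of the repeated accel_dpdk_cryptodev_task_alloc_resources errors that follow, not something this log itself confirms: with 64 KiB I/Os at queue depth 128 the crypto module's mbuf pool can run dry momentarily, and each failed allocation attempt is logged before the affected task is tried again.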
00:34:23.741 [2024-07-15 09:38:32.552895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.552948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.552994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.553043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.553371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.553390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.557372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.557434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.557480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.557545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.558063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.558123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.558180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.558224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.558651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.558670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.562037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.562086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.562127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.562169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.562590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.562637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.562679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.562736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:23.741 [2024-07-15 09:38:32.563198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.563218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.566610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.566669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.566711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.566766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.567208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.567254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.567301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.567343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.567748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.567767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.570980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.571040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.571081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.571126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.571573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.571618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.571662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.571705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.572134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.572155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.575450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.575520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:23.741 [2024-07-15 09:38:32.575563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.575605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.576099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.576145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.576188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.576230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.576634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.576655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.579815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.579862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.579903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.579948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.580391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.580436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.580480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.580528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.580948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.580968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.584209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.584256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.584297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.741 [2024-07-15 09:38:32.584340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.584790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.584836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:23.742 [2024-07-15 09:38:32.584879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.584922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.585315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.585334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.588546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.588595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.588637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.588683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.589144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.589191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.589238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.589291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.589765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.589784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.593001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.593049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.593090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.593133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.593528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.593573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.593615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.593656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.594080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.594100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:23.742 [2024-07-15 09:38:32.597179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.597226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.597267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.597309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.597763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.597810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.597852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.597894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.598325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.598343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.601683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.601732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.601774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.601816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.602294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.602340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.602383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.602427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.602855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.602875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.605890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.605941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.605983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.606026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:23.742 [2024-07-15 09:38:32.606492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.606537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.606581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.606623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.607027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.607047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.610247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.610294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.610336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.610379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.610833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.610879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.610923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.610988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.611432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.611452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.614483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.614531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.614572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.614614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.615060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.615137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.615193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.615244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:23.742 [2024-07-15 09:38:32.615687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.615706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.618772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.618819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.618861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.618903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.619325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.619371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.619432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.619474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.619860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.619882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.623373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.623431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.623502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.623559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.624129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.624210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.624265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.624307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.624701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.624721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.627705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.742 [2024-07-15 09:38:32.627752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:23.742 [2024-07-15 09:38:32.627793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.627865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.628351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.628397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.628453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.628498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.628960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.628981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.632004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.632050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.632105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.632167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.632602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.632647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.632689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.632731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.633138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.633163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.636133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.636180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.636236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.636294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.636840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:23.743 [2024-07-15 09:38:32.636886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:23.743 [2024-07-15 09:38:32.636934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:23.743 ... the same *ERROR*: Failed to get src_mbufs! line from accel_dpdk_cryptodev.c:468 repeats continuously from 09:38:32.636 through 09:38:33.058 while the run proceeds ...
00:34:24.271 [2024-07-15 09:38:33.058448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:24.271 [2024-07-15 09:38:33.058491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.058975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.059030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.059076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.059119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.059162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.059566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.059585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.061865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.061916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.061962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.062006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.062435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.062491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.062534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.062578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.062619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.063008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.063027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.065335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.065381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.065427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.065471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.271 [2024-07-15 09:38:33.065883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.271 [2024-07-15 09:38:33.065951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.066018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.066074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.066128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.066570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.066589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.069050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.069098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.069139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.069181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.069542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.069610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.069655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.069697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.069754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.070159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.070178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.072906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.072968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.073014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.073074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.073522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.073589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.073656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.272 [2024-07-15 09:38:33.073713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.073755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.074163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.074182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.076461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.076509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.076551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.076594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.076961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.077028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.077073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.077116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.077181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.077648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.077668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.079979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.080037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.080086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.080142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.080616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.080680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.080724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.080771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.080815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.272 [2024-07-15 09:38:33.081255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.081278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.083469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.083519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.083561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.083603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.084076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.084136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.084180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.084224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.084267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.084695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.084715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.087162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.087211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.087253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.087298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.087726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.087782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.087826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.087869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.087913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.088318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.088338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.272 [2024-07-15 09:38:33.090580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.090628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.090670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.090713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.091147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.091207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.091250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.091309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.091352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.091828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.091847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.094123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.094172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.094219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.094262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.094650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.094706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.094750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.094794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.094837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.095268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.272 [2024-07-15 09:38:33.095288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.097462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.097509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.273 [2024-07-15 09:38:33.097554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.097608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.098075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.098134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.098176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.098220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.098263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.098687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.098706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.101183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.101230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.101275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.101318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.101733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.101786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.101831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.101876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.101918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.102273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.102293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.104650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.104697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.104738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.104783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.273 [2024-07-15 09:38:33.105223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.105278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.105340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.105383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.105441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.105818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.105838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.108238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.108286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.108332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.108377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.108764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.108829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.108872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.108916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.108964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.109308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.109326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.111709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.111758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.111827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.111870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.112253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.112325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.273 [2024-07-15 09:38:33.112380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.112437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.112496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.112888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.112906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.115340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.115400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.115442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.115484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.115876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.115946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.115990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.116032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.116074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.116559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.116578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.118896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.118947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.119000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.119043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.119471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.119554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.119598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.119642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.273 [2024-07-15 09:38:33.119688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.120148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.120168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.122500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.122560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.122605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.122648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.123059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.123113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.123156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.123200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.123244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.123673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.123693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.125825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.125883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.125934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.125981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.126251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.126316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.126370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.273 [2024-07-15 09:38:33.126414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.126456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.126891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.274 [2024-07-15 09:38:33.126909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.129239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.129284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.129325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.129368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.129657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.129717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.129763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.129805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.129859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.130134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.130153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.131809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.131854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.131894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.131939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.132207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.132269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.132311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.132353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.132395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.132782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.132800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.135504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.274 [2024-07-15 09:38:33.135551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.135595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.135636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.135957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.136019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.136062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.136105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.136146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.136418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.136436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.138959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.141308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.141354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.141397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.274 [2024-07-15 09:38:33.141442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.141714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.141769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.141812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.141862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.141904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.142182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.142201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.143910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.143960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.145484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.145532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.145800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.145862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.145905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.274 [2024-07-15 09:38:33.145952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.145994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.146409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.146430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.148942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.149002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.149044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.150387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.150660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.275 [2024-07-15 09:38:33.150715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.150772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.150814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.150855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.151128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.151147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.154323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.155389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.155777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.156170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.156650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.157053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.158635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.160039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.161572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.161846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.161864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.165129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.165530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.165915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.166305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.166727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.167577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.275 [2024-07-15 09:38:33.168869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.275 [2024-07-15 09:38:33.170401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! entries repeat continuously for every task submitted between 09:38:33.170401 and 09:38:33.540038 ...]
00:34:24.796 [2024-07-15 09:38:33.540038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:24.796 [2024-07-15 09:38:33.540096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.540139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.540182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.540225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.540635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.540653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.542922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.542975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.543016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.543059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.543491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.543548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.543595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.543638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.543680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.544146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.544169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.546518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.546575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.796 [2024-07-15 09:38:33.546617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.546659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.547049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.547102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.547145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.797 [2024-07-15 09:38:33.547187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.547230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.547661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.547682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.549436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.549482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.549523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.549569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.550061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.550117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.550161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.550205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.550247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.550637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.550656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.553030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.553077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.553121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.553163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.553580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.553633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.553677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.553720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.553763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.797 [2024-07-15 09:38:33.554146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.554171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.556095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.556141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.556182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.556222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.556489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.556551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.556593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.556635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.556676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.557165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.557184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.558731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.558778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.558820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.558863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.559254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.559310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.559354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.559400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.559443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.559874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.559896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.797 [2024-07-15 09:38:33.561997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.562884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.564469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.564516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.564560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.564601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.565001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.565073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.565118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.565161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.565204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.565636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.565656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.567757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.567805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.797 [2024-07-15 09:38:33.567845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.567886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.568157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.568222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.568266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.568308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.568347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.568617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.568635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.570284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.570329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.570376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.570424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.570689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.570745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.570795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.570838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.570879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.571291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.571311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.573527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.573572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.573613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.573654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.797 [2024-07-15 09:38:33.573965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.574024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.574067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.574108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.574156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.574426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.574443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.576112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.576169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.576213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.576254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.576522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.576582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.576624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.576666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.576707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.577064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.577084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.579628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.579679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.579725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.579767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.580039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.580102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.797 [2024-07-15 09:38:33.580150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.580192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.580233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.580501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.580519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.582160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.582207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.582247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.582288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.582552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.582610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.582653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.582695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.582744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.583020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.583040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.585434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.585482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.586522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.586571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.586880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.586947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.586990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.587032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.797 [2024-07-15 09:38:33.587072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.587340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.587358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.589066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.589115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.589165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.590685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.590964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.591026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.591071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.591114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.591155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.591523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.591542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.595004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.596541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.598074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.598908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.599188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.600479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.602006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.603530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.604026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.604492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.797 [2024-07-15 09:38:33.604512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.608152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.609814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.611523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.612479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.612801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.614332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.615849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.617240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.617634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.618079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.618099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.621609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.797 [2024-07-15 09:38:33.623134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.624152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.625810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.626113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.627658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.629191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.629795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.630188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.630618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.630637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.634109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.798 [2024-07-15 09:38:33.635727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.636612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.637887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.638168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.639721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.641130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.641522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.641915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.642315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.642335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.645699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.646716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.648358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.649813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.650101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.651641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.652282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.652672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.653062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.653551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.653572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.656789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.657624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.658913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.798 [2024-07-15 09:38:33.660436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.660713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.662206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.662600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.662994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.663386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.663821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.663841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.666572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.668177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.669614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.671144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.671419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.672142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.672533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.672924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.673319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.673713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.673732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.676105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.677399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.678935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.680454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.680788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.798 [2024-07-15 09:38:33.681203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.681598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.681997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.682390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.682670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.682689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.686075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.687696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.689398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.691050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.691425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.691829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.692224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.692616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.693662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.693980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.693999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.696857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.698371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.699900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.700764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.701232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.701638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.702039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.798 [2024-07-15 09:38:33.702432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.704076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.704379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.704397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.707741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.709397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.710950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.711342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.711765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.712173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.712566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.713695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.714987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.715265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.715283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.718394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.719946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.720602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.720999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.721442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.721859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.722258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.723843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:24.798 [2024-07-15 09:38:33.725527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:24.798 [2024-07-15 09:38:33.725806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats several hundred times between 09:38:33.725806 and 09:38:33.997377 (console timestamps 00:34:24.798 to 00:34:25.069); intermediate occurrences omitted ...]
00:34:25.069 [2024-07-15 09:38:33.997377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:25.069 [2024-07-15 09:38:33.997420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.069 [2024-07-15 09:38:33.997466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.069 [2024-07-15 09:38:33.997738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:33.997757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:33.999488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:33.999534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:33.999575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:33.999615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:33.999998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.000060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.000117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.000173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.000217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.000658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.000678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.002822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.002872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.002921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.002966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.003238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.003299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.003342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.003384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.003424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.070 [2024-07-15 09:38:34.003694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.003713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.005400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.005447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.005491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.005548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.005818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.005875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.005933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.005978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.006021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.006406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.006426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.008680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.008730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.008771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.008812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.009166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.009229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.009272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.009315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.009356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.009629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.070 [2024-07-15 09:38:34.009649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.355 [2024-07-15 09:38:34.011376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.011427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.012889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.012942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.013212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.013275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.013319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.013362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.013405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.013801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.013825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.016043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.016089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.016131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.017428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.017703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.017765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.017808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.017851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.017892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.018168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.018187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.021585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.355 [2024-07-15 09:38:34.021995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.355 [2024-07-15 09:38:34.022389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.022783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.023221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.023784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.025085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.026569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.028038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.028318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.028338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.031201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.031604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.032008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.032401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.032852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.034135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.035430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.036958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.038483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.038849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.038868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.040875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.041278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.041672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.042070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.356 [2024-07-15 09:38:34.042448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.043800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.045305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.046832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.048219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.048531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.048550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.050597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.051000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.051393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.051787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.052063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.053350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.054880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.056409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.057123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.057396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.057415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.059503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.059901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.060301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.061261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.061568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.063127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.356 [2024-07-15 09:38:34.064659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.065972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.067309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.067667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.067686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.069894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.070297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.070688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.072364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.072639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.074185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.075714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.076422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.077729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.078006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.078026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.080413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.080811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.081715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.083002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.083277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.084825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.086224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.087468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.356 [2024-07-15 09:38:34.088754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.089035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.089054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.091585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.091989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.093662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.095171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.095446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.097004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.097714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.099065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.100603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.100876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.100895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.103473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.104469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.105763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.107298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.107573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.108880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.110255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.111530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.113055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.113329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.356 [2024-07-15 09:38:34.113348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.116040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.117722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.119310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.120975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.121250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.121988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.123261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.124792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.126336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.126609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.126629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.130028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.131313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.132826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.356 [2024-07-15 09:38:34.134335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.134665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.136040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.137342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.138859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.140385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.140782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.140801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.144983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.357 [2024-07-15 09:38:34.146378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.147900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.149437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.149833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.151362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.153030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.154624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.156134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.156518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.156538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.160059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.161603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.163141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.164420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.164748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.166047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.167579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.169108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.170040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.170484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.170504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.174010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.175539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.177070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.357 [2024-07-15 09:38:34.177774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.178051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.179520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.181062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.182669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.183072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.183527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.183547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.187203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.188738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.190187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.191385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.191703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.193234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.194762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.195877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.196275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.196702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.196722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.200242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.201790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.202522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.203983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.204258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.357 [2024-07-15 09:38:34.205798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.207332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.207731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.208132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.208538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.208561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.212057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.213550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.214722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.216008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.216284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.217832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.218956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.219351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.219740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.220172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.220191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.223551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.224374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.225890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.227561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.227836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.229369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.229768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.357 [2024-07-15 09:38:34.230168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.230563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.231011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.231031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.234319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.235390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.236680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.238204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.238483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.239724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.240122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.240514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.240907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.241339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.241360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.243794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.245360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.247103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.248744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.357 [2024-07-15 09:38:34.249023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.249468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.249861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.250256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.250653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.358 [2024-07-15 09:38:34.251036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.251056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.253706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.254984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.256518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.258052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.258384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.258792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.259187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.259581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.259982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.260293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.260312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.263573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.265107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.265510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.265903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.266302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.266708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.267209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.268568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.270118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.270392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.358 [2024-07-15 09:38:34.270411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.358 [2024-07-15 09:38:34.273702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:25.358 [... same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated for each subsequent allocation attempt between 09:38:34.273702 and 09:38:34.666978 ...]
00:34:25.887 [2024-07-15 09:38:34.666978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:25.887 [2024-07-15 09:38:34.667001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.670689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.672226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.672949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.674242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.676092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.677801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.678202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.678593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.678991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.679011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.682373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.683554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.684844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.686268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.687750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.688445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.688837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.690265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.690692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.690712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.694027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.694886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.696488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.887 [2024-07-15 09:38:34.698119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.699938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.700492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.700882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.701280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.701719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.701740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.705070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.887 [2024-07-15 09:38:34.705775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.707046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.708605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.710558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.710963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.711361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.711746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.712200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.712221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.714820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.715218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.715614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.716015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.716832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.717228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.717617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.888 [2024-07-15 09:38:34.718014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.718399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.718421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.721255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.721655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.722066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.722458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.723331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.723738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.724142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.724535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.724992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.725013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.727827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.728235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.728628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.729028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.729915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.730312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.730707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.731106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.731525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.731545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.734254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.888 [2024-07-15 09:38:34.734651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.735054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.735448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.736257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.736649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.737049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.737442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.737827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.737847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.740543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.740945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.741337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.741727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.742554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.742954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.743346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.743736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.744133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.744153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.746806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.747203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.747594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.747994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.748774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.888 [2024-07-15 09:38:34.749174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.749567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.749969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.750327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.750347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.753054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.753117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.753508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.753910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.754775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.755181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.755571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.755981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.756452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.756474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.759300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.759373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.759768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.759819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.760660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.760709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.761104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.761154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.761538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.888 [2024-07-15 09:38:34.761557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.764177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.764229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.764619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.764670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.765443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.765493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.765877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.765923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.766379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.766399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.888 [2024-07-15 09:38:34.769832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.769894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.770297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.770354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.771245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.771298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.771685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.771731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.772209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.772229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.774935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.774987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.775373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.889 [2024-07-15 09:38:34.775422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.776184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.776238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.776624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.776669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.777127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.777148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.779750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.779805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.780195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.780247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.781114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.781162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.781560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.781616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.781996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.782017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.785133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.785187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.785581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.785632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.786455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.786503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.786894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.889 [2024-07-15 09:38:34.786947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.787357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.787377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.790050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.790104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.790494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.790545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.791300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.791352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.791736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.791782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.792214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.792235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.795094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.795148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.795205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.795243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.796098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.796160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.796204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.796589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.797026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.797050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.799447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.889 [2024-07-15 09:38:34.799494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.799535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.799577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.800034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.800082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.800126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.800172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.800511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.800531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.802903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.802958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.803002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.803046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.803510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.803560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.803602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.803644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.803947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.803967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.805623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.805671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.805712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.805753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.806094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.889 [2024-07-15 09:38:34.806140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.806182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.806223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.806494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.806521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.808693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.808740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.808781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.808826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.809280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.809327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.809370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.809412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.889 [2024-07-15 09:38:34.809709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.809727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.811395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.811442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.811488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.811529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.811840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.811888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.811940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.811982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.812254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.890 [2024-07-15 09:38:34.812273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.814284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.814331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.814373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.814415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.814917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.814968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.815011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.815053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.815486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.815506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.817072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.817118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.817166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.817209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.817572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.817616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.817658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.817698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.818006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.818026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.819860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.819908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.819956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.890 [2024-07-15 09:38:34.820001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.820435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.820481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.820524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.820564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.820997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.821017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.822696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.822742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.822783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.822824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.823178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.823225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.823291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.823333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.823602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.823620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.825314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.825361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.825401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.825443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.825911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.825963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.826007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:25.890 [2024-07-15 09:38:34.826050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.826463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.826483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.828377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.828422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.828463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.828504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.828815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.828861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.828910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.828960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.829320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.829339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.830945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.830991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.831032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.831546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.831592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.831638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:25.890 [2024-07-15 09:38:34.832082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.151 [2024-07-15 09:38:34.979550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.151 [2024-07-15 09:38:34.979624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.151 [2024-07-15 09:38:34.979994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.151 [2024-07-15 09:38:34.980050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.151 [2024-07-15 09:38:34.980418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.151 - 00:34:26.677 [2024-07-15 09:38:34.980851 - 09:38:35.382299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (identical error record repeated for every task submitted in this interval; duplicate log entries condensed) 
00:34:26.677 [2024-07-15 09:38:35.382348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.677 [2024-07-15 09:38:35.383815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.384280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.384338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.384382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.384883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.386264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.386705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.386725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.389908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.390682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.391969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.393497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.393770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.393833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.395182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.395572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.395969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.396386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.396404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.401457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.402883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.404403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.405955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.406360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.677 [2024-07-15 09:38:35.407975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.408365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.408934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.410231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.410674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.677 [2024-07-15 09:38:35.410694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.413909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.414762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.416049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.417579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.417853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.419333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.419724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.420119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.420509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.420939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.420961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.426833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.428373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.429894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.430547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.430820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.431234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.431757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.678 [2024-07-15 09:38:35.433105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.433496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.433899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.433918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.436227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.437522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.439041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.440367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.440642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.441068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.441462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.441853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.442249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.442531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.442550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.447794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.449323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.449946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.451303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.451775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.452178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.453687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.454079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.454813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.678 [2024-07-15 09:38:35.455107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.455126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.458252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.459721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.460116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.460170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.460612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.461013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.461405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.462842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.464129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.464402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.464421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.469670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.469726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.470962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.471353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.471728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.473140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.473530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.473578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.474587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.474900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.474919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.678 [2024-07-15 09:38:35.476545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.477825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.479359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.479407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.479683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.480621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.480686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.481075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.481461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.481918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.481942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.487210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.488775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.488825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.490348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.490625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.490691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.491474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.492558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.492605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.493038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.493059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.495448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.495501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.678 [2024-07-15 09:38:35.495886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.496279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.496719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.497122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.497514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.497581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.497980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.498254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.498273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.501445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.501514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.501902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.502292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.502745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.503150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.503201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.503594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.504227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.504508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.504527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.507076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.678 [2024-07-15 09:38:35.507136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.507532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.507923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.679 [2024-07-15 09:38:35.508382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.508782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.508829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.509221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.509615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.509995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.510014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.514400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.514456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.514848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.515249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.515692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.516093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.516141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.516525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.516917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.517284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.517303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.520015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.520067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.520451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.520840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.521223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.521628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.679 [2024-07-15 09:38:35.521683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.522074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.522463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.522896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.522915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.525955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.526014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.526402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.526808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.527203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.527608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.527658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.528050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.528443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.528857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.528876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.531584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.531635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.532758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.533152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.533553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.533974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.534031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.534414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.679 [2024-07-15 09:38:35.534802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.535205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.535225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.539758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.539815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.540483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.540875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.541257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.541661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.541712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.542104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.542492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.542983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.543003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.545563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.545614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.546006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.547616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.548140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.548539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.548596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.548989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.549386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.549847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.679 [2024-07-15 09:38:35.549870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.553088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.553152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.553887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.555026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.555474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.555878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.555941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.556335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.556725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.557156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.557176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.560008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.561179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.561232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.561618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.561957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.563115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.563164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.563546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.563940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.564317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.564337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.567353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.679 [2024-07-15 09:38:35.568760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.569211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.569260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.569686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.569741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.571339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.571739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.571788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.572222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.572242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.575049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.575100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.575487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.575546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.575965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.576369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.679 [2024-07-15 09:38:35.576419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.577736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.577782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.578244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.578269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.581901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.581960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.582346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.680 [2024-07-15 09:38:35.582390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.582829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.583236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.583288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.583677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.583727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.584014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.584034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.586567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.586618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.587011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.587081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.587501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.587902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.587955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.588342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.588386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.588857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.588877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.594735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.594799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.595197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.595242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.595639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.680 [2024-07-15 09:38:35.597269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.597342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.597859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.597914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.598190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.598209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.600952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.601006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.601395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.601445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.601733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.602712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.602760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.603149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.603193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.603470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.603488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.606935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.606992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.608516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.608566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.609046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.609454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:26.680 [2024-07-15 09:38:35.609518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:26.680 [2024-07-15 09:38:35.609910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the identical accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! message repeats continuously for each allocation attempt from 09:38:35.609910 through 09:38:35.984175 (console time 00:34:26.680 to 00:34:27.206) ...]
00:34:27.206 [2024-07-15 09:38:35.984175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:27.206 [2024-07-15 09:38:35.984238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.984630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.985027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.985078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.985473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.985492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.991126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.991184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.991578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.993275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.993755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.994168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.995700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.995759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.996154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.996589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.996609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.999559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:35.999613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.000009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.000409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.000809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.002515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.002564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.206 [2024-07-15 09:38:36.002956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.003589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.003868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.003890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.007474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.007530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.008688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.009081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.009504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.009915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.009979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.010775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.011839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.012276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.012296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.016824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.016880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.017270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.018290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.018612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.019022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.019071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.019463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.019858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.206 [2024-07-15 09:38:36.020205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.020223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.024440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.024497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.025722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.026348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.026793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.027853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.027904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.028534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.028935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.029312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.029331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.032252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.032306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.032698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.033103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.033385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.034018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.034068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.034453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.036172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.036667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.036686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.206 [2024-07-15 09:38:36.042567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.042627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.043028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.043421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.043866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.044280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.044334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.045820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.206 [2024-07-15 09:38:36.046214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.046642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.046661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.051108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.051166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.051554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.053069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.053527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.053934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.053992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.054395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.054889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.055171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.055190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.058341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.058405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.207 [2024-07-15 09:38:36.059968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.060359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.060790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.062340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.062397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.062789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.063184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.063684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.063703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.066819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.066881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.067283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.067716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.067999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.068407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.068458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.068848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.070348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.070819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.070839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.076647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.077068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.077118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.077510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.207 [2024-07-15 09:38:36.077902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.078311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.078369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.079845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.080235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.080653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.080671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.084870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.085278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.085671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.085721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.085998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.086062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.086453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.086847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.086904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.087392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.087413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.090752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.090822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.091224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.091282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.091655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.093170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.207 [2024-07-15 09:38:36.093219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.093604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.093650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.094028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.094047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.098103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.098161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.098865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.098914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.099200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.099608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.099655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.100048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.100100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.100506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.100524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.104909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.104969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.106250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.106297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.106609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.107931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.107980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.109281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.207 [2024-07-15 09:38:36.109330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.109772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.109791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.113714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.113771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.114196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.114244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.114669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.116028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.207 [2024-07-15 09:38:36.116080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.116473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.116520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.116949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.116970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.120161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.120220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.120613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.120664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.121122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.121531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.121585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.123096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.123144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.123594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.208 [2024-07-15 09:38:36.123614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.128098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.128156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.128544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.128590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.128937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.130164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.130214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.130600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.130646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.130958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.130978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.136281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.136339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.137854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.137903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.138364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.139887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.139941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.140327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.140372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.140738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.140757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.146632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.208 [2024-07-15 09:38:36.146693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.147977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.148028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.148346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.149897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.149951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.151488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.151556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.152092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.208 [2024-07-15 09:38:36.152120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.158143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.158212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.159746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.161280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.161726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.163319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.163397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.165065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.165118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.165400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.165420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.169240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.169293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.169687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.469 [2024-07-15 09:38:36.169733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.170029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.171311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.172833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.172882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.172924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.173203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.173223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.177270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.177322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.177363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.469 [2024-07-15 09:38:36.177405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.177804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.178220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.178269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.178315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.178359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.178637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.178656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.182220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.182271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.182312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.182359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.182723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.470 [2024-07-15 09:38:36.182782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.182825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.182868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.182910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.183235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.183259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.188394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.188447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.188506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.188552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.188824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.188885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.188936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.188980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.189022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.189468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.189488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.193526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.193577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.193618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.193668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.193953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.194012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.194056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.470 [2024-07-15 09:38:36.194099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.194155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.194426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.194446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.198339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.198391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.198433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.198474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.198895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.198959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.199004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.199047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.199094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.199369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.199388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.203012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.203073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.203114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.203155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.203429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.203493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.203536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.203579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.470 [2024-07-15 09:38:36.203620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.470 [2024-07-15 09:38:36.204038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:27.470 [2024-07-15 09:38:36.204059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:27.470 [2024-07-15 09:38:36.209547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" entries repeat continuously from 09:38:36.204 through 09:38:36.568 (elapsed 00:34:27.470-00:34:27.737); duplicate entries omitted ...]
00:34:27.737 [2024-07-15 09:38:36.567767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:27.737 [2024-07-15 09:38:36.567816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:27.737 [2024-07-15 09:38:36.568207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.568606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.569053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.569073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.571359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.572655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.572705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.574226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.574506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.576006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.576055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.576437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.576824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.577242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.577262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.579160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.580680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.581386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.581433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.581742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.581807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.583338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.584873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.584920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.737 [2024-07-15 09:38:36.585198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.585217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.589061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.589112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.590504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.590552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.590827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.592380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.592440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.593228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.593280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.593595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.593614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.595716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.595767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.596156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.596202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.596655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.598104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.598153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.599592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.599639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.599917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.599941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.737 [2024-07-15 09:38:36.603149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.603202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.604494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.604542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.604945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.605346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.605392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.605779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.605825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.606274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.606296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.608542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.608592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.609875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.609922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.610199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.611790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.611847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.613128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.613175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.613580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.613599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.617109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.617160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.737 [2024-07-15 09:38:36.618695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.618743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.619020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.619766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.619814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.621088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.621136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.621408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.621427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.623798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.623848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.624241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.737 [2024-07-15 09:38:36.624293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.624566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.625872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.625922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.627450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.627506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.627779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.627798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.631056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.631108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.631497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.631548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.738 [2024-07-15 09:38:36.632008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.632410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.632457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.632840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.632902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.633180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.633200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.636291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.636345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.637857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.637905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.638186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.639737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.639785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.640177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.640224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.640659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.640678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.644275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.644327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.645850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.646540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.646817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.648418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.738 [2024-07-15 09:38:36.648474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.650027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.650075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.650347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.650367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.652812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.652864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.654341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.654388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.654683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.656232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.657759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.657806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.657847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.658255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.658275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.659841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.659889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.659936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.659979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.660366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.660773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.660819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.660861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.738 [2024-07-15 09:38:36.660907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.661373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.661397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.663154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.663199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.663243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.663293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.663567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.663623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.663674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.663717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.663759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.664053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.664073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.665732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.665794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.665836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.665879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.666340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.666398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.666442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.666486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.666532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.666913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.738 [2024-07-15 09:38:36.666938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.668816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.668861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.668902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.668948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.669220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.669279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.669322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.669364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.669406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.669793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.738 [2024-07-15 09:38:36.669813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.671383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.671437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.671479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.671520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.671967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.672024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.672069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.672117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.672160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.672597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.672617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.674850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.739 [2024-07-15 09:38:36.674914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.674961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.675003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.675269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.675330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.675373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.675417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.675459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.675772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.675792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.677362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.677408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.677454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.677848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.678295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.678354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.678400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.678443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.678486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.678904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.678923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.680849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.682372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.682437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:27.739 [2024-07-15 09:38:36.682480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.682919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.683000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.683044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:27.739 [2024-07-15 09:38:36.684287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.684347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.684695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.684724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.687460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.687513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.687556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.688371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.688721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.688786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.690294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.690342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.690383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.690656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.690675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.692386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.692432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.693921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.693973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.694373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.000 [2024-07-15 09:38:36.694806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.694858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.694903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.695299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.695719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.695740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.697407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.698441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.698505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.698549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.698824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.698885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.000 [2024-07-15 09:38:36.698935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.700567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.700615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.700888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.700908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.703119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.703511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.703557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.703601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.703875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.703939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.705256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.001 [2024-07-15 09:38:36.705305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.705346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.705625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.705644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.707337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.708870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.708919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.708967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.709301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.709362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.709753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.709799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.709845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.710301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.710321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.712556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.714235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.714285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.714326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.714597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.714660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.715864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.715912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.715957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.001 [2024-07-15 09:38:36.716301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.716321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.718226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.718618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.718665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.718708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.719144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.719200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.720105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.720153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.720195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.720508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.720527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.722165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.723519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.723567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.723618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.723897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.723961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.725572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.725627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.725675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.726072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.001 [2024-07-15 09:38:36.726092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.001 [2024-07-15 09:38:36.728471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.001 [2024-07-15 09:38:36.728 - 09:38:37.034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (message repeated continuously over this interval) 
00:34:28.268 [2024-07-15 09:38:37.034612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.268 [2024-07-15 09:38:37.035025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.035045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.038408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.038460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.039240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.040767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.041050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.042588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.042638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.044241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.044297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.044666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.044686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.047102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.047148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.048435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.048483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.048754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.050316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.051473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.051522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.051572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.051846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.051865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.268 [2024-07-15 09:38:37.053524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.053573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.053613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.053655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.054113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.054516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.054568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.054611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.054653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.055071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.055092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.056739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.056785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.056825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.056866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.057179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.057242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.057286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.057333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.057375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.057653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.057674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.059335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.059382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.268 [2024-07-15 09:38:37.059423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.059464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.059895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.059956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.060001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.060043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.060085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.060532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.060551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.062343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.062388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.062429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.062470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.062739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.062798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.062841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.062893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.062941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.063288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.063307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.064897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.064948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.064990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.065049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.268 [2024-07-15 09:38:37.065501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.065566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.065615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.065658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.268 [2024-07-15 09:38:37.065702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.066129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.066150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.068359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.068412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.068457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.068499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.068770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.068830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.068873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.068915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.068961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.069316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.069335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.070888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.070939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.070984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.071373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.071830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.071886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.269 [2024-07-15 09:38:37.071937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.071981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.072023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.072413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.072434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.074372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.075912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.075965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.076006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.076382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.076447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.076509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.078066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.078131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.078407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.078426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.080759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.080812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.080854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.081251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.081602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.081660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.082950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.082999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.269 [2024-07-15 09:38:37.083040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.083314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.083333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.084953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.084998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.086517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.086563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.086836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.087460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.087509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.087549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.087940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.088326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.088346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.090271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.091806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.091855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.091896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.092249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.092312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.092356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.094003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.094058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.094333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.269 [2024-07-15 09:38:37.094353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.096248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.096643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.096692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.096734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.097180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.097238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.097978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.098026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.098066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.098403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.098423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.100060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.101344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.101392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.101433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.101706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.101767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.103304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.103352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.103401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.103775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.103802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.106274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.269 [2024-07-15 09:38:37.107554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.107602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.107643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.107915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.107983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.109515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.269 [2024-07-15 09:38:37.109563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.109603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.109923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.109948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.111500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.112164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.112228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.112271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.112721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.112776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.113173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.113224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.113267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.113712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.113731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.115411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.116754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.116802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.270 [2024-07-15 09:38:37.116845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.117127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.117186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.118452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.118506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.118547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.118820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.118839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.121036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.121433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.121479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.121523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.121870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.121939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.123215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.123264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.123305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.123581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.123600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.125243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.126767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.126815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.126855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.127134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.270 [2024-07-15 09:38:37.127194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.127583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.127634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.127676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.128126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.128146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.130282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.131816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.131864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.131905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.132183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.132250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.133435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.133484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.133532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.133811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.133831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.135477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.135873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.135919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.135970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.136349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.136404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.136793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.270 [2024-07-15 09:38:37.136839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.136882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.137208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.137227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.138843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.138890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.140570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.140625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.140900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.140963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.142489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.142539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.142590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.142864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.142884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.145880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.145937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.145984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.147243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.147520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.149026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.149076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.149120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.150623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.270 [2024-07-15 09:38:37.150942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.150961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.153065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.153462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.153508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.154032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.154310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.270 [2024-07-15 09:38:37.154373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.156066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.156122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.157646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.157930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.157949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.159633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.160632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.160689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.161088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.161523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.161582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.162004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.162056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.162525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.162806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.162826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:28.271 [2024-07-15 09:38:37.164490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.165768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.165817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.167342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.167621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.167682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.168901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.168957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.169341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.169779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.169800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.171979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.173537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.173593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.175174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.175566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.175629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.176896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.176950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.178464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.178740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.178759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.181027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:28.271 [2024-07-15 09:38:37.181102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:29.205 Latency(us)
00:34:29.205 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:29.205 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:29.205 Verification LBA range: start 0x0 length 0x100
00:34:29.205 crypto_ram : 6.11 41.88 2.62 0.00 0.00 2962394.60 317308.22 2567643.49
00:34:29.205 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:29.205 Verification LBA range: start 0x100 length 0x100
00:34:29.205 crypto_ram : 6.15 41.62 2.60 0.00 0.00 2986215.51 293601.28 2684354.56
00:34:29.205 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:29.205 Verification LBA range: start 0x0 length 0x100
00:34:29.205 crypto_ram1 : 6.11 41.87 2.62 0.00 0.00 2856913.70 315484.61 2348810.24
00:34:29.205 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:29.205 Verification LBA range: start 0x100 length 0x100
00:34:29.205 crypto_ram1 : 6.15 41.61 2.60 0.00 0.00 2880634.88 293601.28 2480110.19
00:34:29.205 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:29.205 Verification LBA range: start 0x0 length 0x100
00:34:29.205 crypto_ram2 : 5.67 269.55 16.85 0.00 0.00 423710.72 87989.20 667441.42
00:34:29.205 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:29.205 Verification LBA range: start 0x100 length 0x100
00:34:29.205 crypto_ram2 : 5.61 255.28 15.96 0.00 0.00 445918.96 34876.55 685677.52
00:34:29.205 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:29.205 Verification LBA range: start 0x0 length 0x100
00:34:29.205 crypto_ram3 : 5.77 278.18 17.39 0.00 0.00 396038.39 40575.33 496022.04
00:34:29.205 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:29.205 Verification LBA range: start 0x100 length 0x100
00:34:29.205 crypto_ram3 : 5.76 267.39 16.71 0.00 0.00 413201.80 37611.97 353780.42
00:34:29.205 ===================================================================================================================
00:34:29.205 Total : 1237.39 77.34 0.00 0.00 778564.32 34876.55 2684354.56
00:34:29.463 real 0m9.331s 00:34:29.463 user 0m17.691s 00:34:29.463 sys 0m0.474s 00:34:29.463 09:38:38 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:29.463 09:38:38 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:34:29.463 ************************************ 00:34:29.463 END TEST bdev_verify_big_io 00:34:29.463 ************************************ 00:34:29.463 09:38:38 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:29.463 09:38:38 blockdev_crypto_qat -- 
bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:29.463 09:38:38 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:29.463 09:38:38 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:29.463 09:38:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:29.463 ************************************ 00:34:29.463 START TEST bdev_write_zeroes 00:34:29.463 ************************************ 00:34:29.463 09:38:38 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:29.722 [2024-07-15 09:38:38.425613] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:34:29.722 [2024-07-15 09:38:38.425675] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid291472 ] 00:34:29.722 [2024-07-15 09:38:38.542315] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:29.722 [2024-07-15 09:38:38.642334] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:29.722 [2024-07-15 09:38:38.663686] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:29.722 [2024-07-15 09:38:38.671719] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:29.980 [2024-07-15 09:38:38.679736] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:29.980 [2024-07-15 09:38:38.791973] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:32.508 [2024-07-15 09:38:40.995875] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:32.508 [2024-07-15 09:38:40.995962] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:32.508 [2024-07-15 09:38:40.995979] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:32.508 [2024-07-15 09:38:41.003894] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:32.508 [2024-07-15 09:38:41.003916] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:32.508 [2024-07-15 09:38:41.003936] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:32.508 [2024-07-15 09:38:41.011914] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:32.508 [2024-07-15 09:38:41.011943] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:32.508 [2024-07-15 09:38:41.011956] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:32.508 [2024-07-15 09:38:41.019941] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:32.508 [2024-07-15 09:38:41.019959] bdev.c:8157:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc3 00:34:32.508 [2024-07-15 09:38:41.019971] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:32.508 Running I/O for 1 seconds... 00:34:33.442 00:34:33.442 Latency(us) 00:34:33.442 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:33.442 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:33.442 crypto_ram : 1.02 2014.70 7.87 0.00 0.00 63051.13 5641.79 76135.74 00:34:33.442 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:33.442 crypto_ram1 : 1.03 2027.83 7.92 0.00 0.00 62342.42 5613.30 70664.90 00:34:33.442 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:33.442 crypto_ram2 : 1.02 15512.67 60.60 0.00 0.00 8125.87 2436.23 10656.72 00:34:33.442 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:33.442 crypto_ram3 : 1.02 15544.77 60.72 0.00 0.00 8083.30 2436.23 8434.20 00:34:33.442 =================================================================================================================== 00:34:33.442 Total : 35099.97 137.11 0.00 0.00 14419.43 2436.23 76135.74 00:34:33.700 00:34:33.700 real 0m4.150s 00:34:33.700 user 0m3.745s 00:34:33.700 sys 0m0.364s 00:34:33.700 09:38:42 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:33.700 09:38:42 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:34:33.700 ************************************ 00:34:33.700 END TEST bdev_write_zeroes 00:34:33.700 ************************************ 00:34:33.700 09:38:42 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:33.700 09:38:42 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:33.700 09:38:42 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:33.700 09:38:42 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:33.700 09:38:42 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:33.700 ************************************ 00:34:33.700 START TEST bdev_json_nonenclosed 00:34:33.700 ************************************ 00:34:33.700 09:38:42 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:33.700 [2024-07-15 09:38:42.642465] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
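[editor's note] For orientation, the write_zeroes job table above comes from a single bdevperf invocation against the JSON config that declares the Malloc bases and crypto vbdevs. A minimal sketch of that call, with SPDK_ROOT standing in for the workspace path used in this log:

# Sketch of the bdev_write_zeroes invocation traced above; SPDK_ROOT is a
# placeholder for the crypto-phy-autotest workspace, and bdev.json is the
# config written earlier in the run.
SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk

# -q 128, -o 4096 and -t 1 match the queue depth, I/O size and 1 s runtime
# reported in the job table above.
"$SPDK_ROOT/build/examples/bdevperf" --json "$SPDK_ROOT/test/bdev/bdev.json" \
    -q 128 -o 4096 -w write_zeroes -t 1 ''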
00:34:33.700 [2024-07-15 09:38:42.642526] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid292014 ] 00:34:33.957 [2024-07-15 09:38:42.768704] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:33.957 [2024-07-15 09:38:42.864949] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:33.957 [2024-07-15 09:38:42.865017] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:34:33.957 [2024-07-15 09:38:42.865037] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:33.957 [2024-07-15 09:38:42.865051] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:34.214 00:34:34.214 real 0m0.382s 00:34:34.214 user 0m0.222s 00:34:34.214 sys 0m0.157s 00:34:34.214 09:38:42 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:34:34.214 09:38:42 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:34.214 09:38:42 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:34:34.214 ************************************ 00:34:34.214 END TEST bdev_json_nonenclosed 00:34:34.214 ************************************ 00:34:34.214 09:38:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:34:34.214 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:34:34.214 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:34.214 09:38:43 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:34.214 09:38:43 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:34.214 09:38:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:34.214 ************************************ 00:34:34.214 START TEST bdev_json_nonarray 00:34:34.214 ************************************ 00:34:34.214 09:38:43 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:34.214 [2024-07-15 09:38:43.099623] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:34:34.214 [2024-07-15 09:38:43.099683] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid292035 ] 00:34:34.470 [2024-07-15 09:38:43.229236] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:34.470 [2024-07-15 09:38:43.333966] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:34.470 [2024-07-15 09:38:43.334044] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
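[editor's note] The two JSON negative tests around this point exercise bdevperf's config validation: json_config_prepare_ctx rejects a file whose top level is not enclosed in {} and one whose "subsystems" key is not an array, and the harness treats the resulting exit status 234 as the expected outcome. The actual nonenclosed.json and nonarray.json files are not printed in this log, so the following is only an illustrative sketch of shapes that would trip the same two checks:

# Illustrative only -- not the repository's test files, just JSON shapes that
# reproduce the two errors reported above.

# "not enclosed in {}": top-level value is an array rather than an object
cat > /tmp/nonenclosed.json <<'EOF'
[
  { "subsystems": [] }
]
EOF

# "'subsystems' should be an array": the key maps to an object instead
cat > /tmp/nonarray.json <<'EOF'
{
  "subsystems": { "subsystem": "bdev", "config": [] }
}
EOF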
00:34:34.470 [2024-07-15 09:38:43.334064] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:34.470 [2024-07-15 09:38:43.334077] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:34.727 00:34:34.727 real 0m0.397s 00:34:34.727 user 0m0.234s 00:34:34.727 sys 0m0.161s 00:34:34.727 09:38:43 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:34:34.727 09:38:43 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:34.727 09:38:43 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:34:34.727 ************************************ 00:34:34.727 END TEST bdev_json_nonarray 00:34:34.727 ************************************ 00:34:34.727 09:38:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:34:34.727 09:38:43 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:34:34.727 00:34:34.727 real 1m12.351s 00:34:34.727 user 2m39.669s 00:34:34.727 sys 0m9.144s 00:34:34.727 09:38:43 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:34.727 09:38:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:34.727 ************************************ 00:34:34.727 END TEST blockdev_crypto_qat 00:34:34.727 ************************************ 00:34:34.727 09:38:43 -- common/autotest_common.sh@1142 -- # return 0 00:34:34.727 09:38:43 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:34:34.727 09:38:43 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:34:34.727 09:38:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:34.727 09:38:43 -- common/autotest_common.sh@10 -- # set +x 00:34:34.727 ************************************ 00:34:34.727 START TEST chaining 00:34:34.727 ************************************ 00:34:34.727 09:38:43 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:34:34.727 * Looking for test storage... 
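[editor's note] Both the blockdev_crypto_qat suite that just finished and the chaining run starting here are wrapped by the run_test helper, which produces the START TEST / END TEST banners and the real/user/sys timing lines seen throughout this log. Purely as a reconstruction from those banners (the real helper in autotest_common.sh carries extra bookkeeping such as xtrace control):

# Reconstructed sketch of the run_test pattern implied by the banners in this
# log; NOT the actual autotest_common.sh implementation.
run_test() {
    local name=$1
    shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"
    local rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return $rc
}

# e.g. run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh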
00:34:34.727 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:34.727 09:38:43 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:34:34.727 09:38:43 chaining -- nvmf/common.sh@7 -- # uname -s 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:34:35.000 09:38:43 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:34:35.000 09:38:43 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:34:35.000 09:38:43 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:34:35.000 09:38:43 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:35.000 09:38:43 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:35.000 09:38:43 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:35.000 09:38:43 chaining -- paths/export.sh@5 -- # 
export PATH 00:34:35.000 09:38:43 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@47 -- # : 0 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:34:35.000 09:38:43 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:34:35.000 09:38:43 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:34:35.000 09:38:43 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:34:35.000 09:38:43 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:34:35.000 09:38:43 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:34:35.000 09:38:43 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:35.000 09:38:43 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:35.000 09:38:43 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:34:35.000 09:38:43 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:34:35.000 09:38:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@296 -- # e810=() 00:34:41.577 
09:38:50 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@297 -- # x722=() 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@298 -- # mlx=() 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:34:41.577 09:38:50 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@336 -- # return 1 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:34:41.578 WARNING: No supported devices were found, fallback requested for tcp test 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:34:41.578 09:38:50 
chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:34:41.578 Cannot find device "nvmf_tgt_br" 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@155 -- # true 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:34:41.578 Cannot find device "nvmf_tgt_br2" 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@156 -- # true 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:34:41.578 Cannot find device "nvmf_tgt_br" 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@158 -- # true 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:34:41.578 Cannot find device "nvmf_tgt_br2" 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@159 -- # true 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:34:41.578 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@162 -- # true 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:34:41.578 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@163 -- # true 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:34:41.578 09:38:50 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:34:41.836 
09:38:50 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:34:41.836 09:38:50 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:34:42.095 09:38:50 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:34:42.095 09:38:50 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:34:42.095 09:38:51 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:34:42.095 09:38:51 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:34:42.353 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:34:42.353 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.105 ms 00:34:42.353 00:34:42.353 --- 10.0.0.2 ping statistics --- 00:34:42.353 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:42.353 rtt min/avg/max/mdev = 0.105/0.105/0.105/0.000 ms 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:34:42.353 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:34:42.353 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.073 ms 00:34:42.353 00:34:42.353 --- 10.0.0.3 ping statistics --- 00:34:42.353 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:42.353 rtt min/avg/max/mdev = 0.073/0.073/0.073/0.000 ms 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:34:42.353 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:34:42.353 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.038 ms 00:34:42.353 00:34:42.353 --- 10.0.0.1 ping statistics --- 00:34:42.353 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:42.353 rtt min/avg/max/mdev = 0.038/0.038/0.038/0.000 ms 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@433 -- # return 0 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:34:42.353 09:38:51 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:34:42.353 09:38:51 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:34:42.353 09:38:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@481 -- # nvmfpid=295711 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@482 -- # waitforlisten 295711 00:34:42.353 09:38:51 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:34:42.353 09:38:51 chaining -- common/autotest_common.sh@829 -- # '[' -z 295711 ']' 00:34:42.353 09:38:51 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:42.353 09:38:51 chaining -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:34:42.353 09:38:51 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:42.353 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:42.353 09:38:51 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:42.353 09:38:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:42.353 [2024-07-15 09:38:51.195281] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:34:42.353 [2024-07-15 09:38:51.195349] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:42.611 [2024-07-15 09:38:51.324225] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:42.611 [2024-07-15 09:38:51.431719] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:34:42.611 [2024-07-15 09:38:51.431766] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:34:42.611 [2024-07-15 09:38:51.431781] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:34:42.611 [2024-07-15 09:38:51.431794] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:34:42.611 [2024-07-15 09:38:51.431805] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:34:42.611 [2024-07-15 09:38:51.431835] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:43.177 09:38:52 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:43.177 09:38:52 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:43.177 09:38:52 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:34:43.177 09:38:52 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:43.177 09:38:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:43.435 09:38:52 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@69 -- # mktemp 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.dKUfCLHdZG 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@69 -- # mktemp 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.2rfrVCUncy 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:34:43.435 09:38:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:43.435 09:38:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:43.435 malloc0 00:34:43.435 true 00:34:43.435 true 00:34:43.435 [2024-07-15 09:38:52.214475] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:43.435 crypto0 00:34:43.435 [2024-07-15 09:38:52.222501] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:43.435 crypto1 00:34:43.435 [2024-07-15 09:38:52.230622] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:34:43.435 [2024-07-15 09:38:52.246832] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:34:43.435 09:38:52 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@85 -- # update_stats 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:43.435 09:38:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:43.436 09:38:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:43.436 09:38:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:43.436 09:38:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:43.436 09:38:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:43.436 09:38:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:43.436 09:38:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:43.436 09:38:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:43.436 09:38:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:43.436 09:38:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:43.436 09:38:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:43.694 09:38:52 chaining -- 
bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:43.694 09:38:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:43.694 09:38:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:43.694 09:38:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.dKUfCLHdZG bs=1K count=64 00:34:43.694 64+0 records in 00:34:43.694 64+0 records out 00:34:43.694 65536 bytes (66 kB, 64 KiB) copied, 0.000469302 s, 140 MB/s 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.dKUfCLHdZG --ob Nvme0n1 --bs 65536 --count 1 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@25 -- # local config 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:43.694 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:43.694 "subsystems": [ 00:34:43.694 { 00:34:43.694 "subsystem": "bdev", 00:34:43.694 "config": [ 00:34:43.694 { 00:34:43.694 "method": "bdev_nvme_attach_controller", 00:34:43.694 "params": { 00:34:43.694 "trtype": "tcp", 00:34:43.694 "adrfam": "IPv4", 00:34:43.694 "name": "Nvme0", 00:34:43.694 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:43.694 "traddr": "10.0.0.2", 00:34:43.694 "trsvcid": "4420" 00:34:43.694 } 00:34:43.694 }, 00:34:43.694 { 00:34:43.694 "method": "bdev_set_options", 00:34:43.694 "params": { 00:34:43.694 "bdev_auto_examine": false 00:34:43.694 } 00:34:43.694 } 00:34:43.694 ] 00:34:43.694 } 00:34:43.694 ] 00:34:43.694 }' 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.dKUfCLHdZG --ob Nvme0n1 --bs 65536 --count 1 00:34:43.694 09:38:52 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:43.694 "subsystems": [ 00:34:43.694 { 00:34:43.694 "subsystem": "bdev", 00:34:43.694 "config": [ 00:34:43.694 { 00:34:43.694 "method": "bdev_nvme_attach_controller", 00:34:43.694 "params": { 00:34:43.694 "trtype": "tcp", 00:34:43.694 "adrfam": "IPv4", 00:34:43.694 "name": "Nvme0", 00:34:43.694 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:43.694 "traddr": "10.0.0.2", 00:34:43.694 "trsvcid": "4420" 00:34:43.694 } 00:34:43.694 }, 00:34:43.694 { 00:34:43.694 "method": "bdev_set_options", 00:34:43.694 "params": { 00:34:43.694 "bdev_auto_examine": false 00:34:43.694 } 00:34:43.694 } 00:34:43.694 ] 00:34:43.694 } 00:34:43.694 ] 00:34:43.694 }' 00:34:43.694 [2024-07-15 09:38:52.546767] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
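[editor's note] The write step above is the core data-path pattern of the chaining test: generate a remote-target bdev config with gen_nvme.sh, append a bdev_set_options entry that disables auto-examine, and feed that JSON to spdk_dd on a spare file descriptor while copying the 64 KiB random file onto the crypto vbdev. A condensed sketch of that flow, with SPDK_ROOT as a placeholder and process substitution standing in for the /dev/fd/62 handle seen in the trace:

# Condensed sketch of the spdk_dd write seen above (placeholder paths).
SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk

# Remote-target config: attach Nvme0 over NVMe/TCP, then disable auto-examine.
config=$("$SPDK_ROOT/scripts/gen_nvme.sh" --mode=remote --json-with-subsystems \
             --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 |
         jq '.subsystems[0].config[.subsystems[0].config | length] |=
             {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')

# 64 KiB of random plaintext, written through the crypto vbdev as one 64 KiB I/O.
dd if=/dev/urandom of=/tmp/plaintext bs=1K count=64
"$SPDK_ROOT/build/bin/spdk_dd" -c <(echo "$config") \
    --if /tmp/plaintext --ob Nvme0n1 --bs 65536 --count 1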
00:34:43.694 [2024-07-15 09:38:52.546818] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid295934 ] 00:34:43.952 [2024-07-15 09:38:52.659152] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:43.952 [2024-07-15 09:38:52.756183] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:44.469  Copying: 64/64 [kB] (average 10 MBps) 00:34:44.469 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 
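[editor's note] The stat assertions before and after each transfer all go through the same get_stat helper, which queries the accel framework over RPC and extracts either the global sequence_executed counter or a per-opcode executed count. The sketch below is reconstructed from the xtrace above; the traced function also sets an rpc variable (defaulting to rpc_cmd, the autotest wrapper around scripts/rpc.py), which this sketch simply hard-codes:

# Sketch of get_stat as reconstructed from the xtrace above.
get_stat() {
    local event=$1 opcode=$2
    if [[ -z $opcode ]]; then
        # e.g. get_stat sequence_executed
        rpc_cmd accel_get_stats | jq -r ".$event"
    else
        # e.g. get_stat executed encrypt
        rpc_cmd accel_get_stats |
            jq -r ".operations[] | select(.opcode == \"$opcode\").$event"
    fi
}

# The surrounding checks compare deltas against the snapshot taken before the
# 64 KiB write, e.g. (( $(get_stat sequence_executed) == stats[sequence_executed] + 1 ))
# and (( $(get_stat executed encrypt) == stats[encrypt_executed] + 2 )).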
00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@96 -- # update_stats 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:44.469 09:38:53 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:44.469 09:38:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:44.727 09:38:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.2rfrVCUncy --ib Nvme0n1 --bs 65536 --count 1 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@25 -- # local config 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:44.727 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:44.727 "subsystems": [ 00:34:44.727 { 00:34:44.727 "subsystem": "bdev", 00:34:44.727 "config": [ 00:34:44.727 { 00:34:44.727 "method": "bdev_nvme_attach_controller", 00:34:44.727 "params": { 00:34:44.727 "trtype": "tcp", 00:34:44.727 "adrfam": "IPv4", 00:34:44.727 "name": "Nvme0", 00:34:44.727 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:44.727 "traddr": "10.0.0.2", 00:34:44.727 "trsvcid": "4420" 00:34:44.727 } 00:34:44.727 }, 00:34:44.727 { 00:34:44.727 "method": "bdev_set_options", 00:34:44.727 "params": { 00:34:44.727 "bdev_auto_examine": false 00:34:44.727 } 00:34:44.727 } 00:34:44.727 ] 00:34:44.727 } 00:34:44.727 ] 00:34:44.727 }' 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:44.727 "subsystems": [ 00:34:44.727 { 00:34:44.727 "subsystem": "bdev", 00:34:44.727 "config": [ 00:34:44.727 { 00:34:44.727 "method": "bdev_nvme_attach_controller", 00:34:44.727 "params": { 00:34:44.727 "trtype": "tcp", 00:34:44.727 "adrfam": "IPv4", 00:34:44.727 "name": "Nvme0", 00:34:44.727 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:44.727 "traddr": "10.0.0.2", 00:34:44.727 "trsvcid": "4420" 00:34:44.727 } 00:34:44.727 }, 00:34:44.727 { 00:34:44.727 "method": "bdev_set_options", 00:34:44.727 "params": { 00:34:44.727 "bdev_auto_examine": false 00:34:44.727 } 00:34:44.727 } 00:34:44.727 ] 00:34:44.727 } 00:34:44.727 ] 00:34:44.727 }' 00:34:44.727 09:38:53 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.2rfrVCUncy --ib Nvme0n1 --bs 65536 --count 1 00:34:44.985 [2024-07-15 09:38:53.695745] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:34:44.985 [2024-07-15 09:38:53.695812] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid296144 ] 00:34:44.985 [2024-07-15 09:38:53.826974] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:44.985 [2024-07-15 09:38:53.925576] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:45.502  Copying: 64/64 [kB] (average 12 MBps) 00:34:45.502 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:45.502 09:38:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:45.502 09:38:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:45.502 09:38:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:45.502 09:38:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:45.502 09:38:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:45.502 09:38:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:45.502 09:38:54 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:45.760 09:38:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 
00:34:45.760 09:38:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:45.760 09:38:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:45.760 09:38:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:45.760 09:38:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:45.760 09:38:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.dKUfCLHdZG /tmp/tmp.2rfrVCUncy 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@25 -- # local config 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:45.760 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:45.760 "subsystems": [ 00:34:45.760 { 00:34:45.760 "subsystem": "bdev", 00:34:45.760 "config": [ 00:34:45.760 { 00:34:45.760 "method": "bdev_nvme_attach_controller", 00:34:45.760 "params": { 00:34:45.760 "trtype": "tcp", 00:34:45.760 "adrfam": "IPv4", 00:34:45.760 "name": "Nvme0", 00:34:45.760 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:45.760 "traddr": "10.0.0.2", 00:34:45.760 "trsvcid": "4420" 00:34:45.760 } 00:34:45.760 }, 00:34:45.760 { 00:34:45.760 "method": "bdev_set_options", 00:34:45.760 "params": { 00:34:45.760 "bdev_auto_examine": false 00:34:45.760 } 00:34:45.760 } 00:34:45.760 ] 00:34:45.760 } 00:34:45.760 ] 00:34:45.760 }' 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:45.760 09:38:54 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:45.760 "subsystems": [ 00:34:45.760 { 00:34:45.760 "subsystem": "bdev", 00:34:45.760 "config": [ 00:34:45.760 { 00:34:45.760 "method": "bdev_nvme_attach_controller", 00:34:45.760 "params": { 00:34:45.760 "trtype": "tcp", 00:34:45.760 "adrfam": "IPv4", 00:34:45.760 "name": "Nvme0", 00:34:45.760 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:45.760 "traddr": "10.0.0.2", 00:34:45.761 "trsvcid": "4420" 00:34:45.761 } 00:34:45.761 }, 00:34:45.761 { 00:34:45.761 "method": "bdev_set_options", 00:34:45.761 "params": { 00:34:45.761 "bdev_auto_examine": false 00:34:45.761 } 00:34:45.761 } 00:34:45.761 ] 00:34:45.761 
} 00:34:45.761 ] 00:34:45.761 }' 00:34:45.761 [2024-07-15 09:38:54.646587] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:34:45.761 [2024-07-15 09:38:54.646652] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid296343 ] 00:34:46.018 [2024-07-15 09:38:54.777181] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:46.018 [2024-07-15 09:38:54.874581] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:46.534  Copying: 64/64 [kB] (average 31 MBps) 00:34:46.534 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@106 -- # update_stats 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:46.534 09:38:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:46.534 09:38:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:46.534 09:38:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:46.534 09:38:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:46.534 09:38:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:46.534 09:38:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:34:46.534 09:38:55 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:46.535 09:38:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:46.535 09:38:55 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:34:46.535 09:38:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:46.535 09:38:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:46.535 09:38:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:46.535 09:38:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.dKUfCLHdZG --ob Nvme0n1 --bs 4096 --count 16 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@25 -- # local config 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:46.535 09:38:55 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:46.535 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:46.793 09:38:55 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:46.793 "subsystems": [ 00:34:46.793 { 00:34:46.793 "subsystem": "bdev", 00:34:46.793 "config": [ 00:34:46.793 { 00:34:46.794 "method": "bdev_nvme_attach_controller", 00:34:46.794 "params": { 00:34:46.794 "trtype": "tcp", 00:34:46.794 "adrfam": "IPv4", 00:34:46.794 "name": "Nvme0", 00:34:46.794 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:46.794 "traddr": "10.0.0.2", 00:34:46.794 "trsvcid": "4420" 00:34:46.794 } 00:34:46.794 }, 00:34:46.794 { 00:34:46.794 "method": "bdev_set_options", 00:34:46.794 "params": { 00:34:46.794 "bdev_auto_examine": false 00:34:46.794 } 00:34:46.794 } 00:34:46.794 ] 00:34:46.794 } 00:34:46.794 ] 00:34:46.794 }' 00:34:46.794 09:38:55 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.dKUfCLHdZG --ob Nvme0n1 --bs 4096 --count 16 00:34:46.794 09:38:55 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:46.794 "subsystems": [ 00:34:46.794 { 00:34:46.794 "subsystem": "bdev", 00:34:46.794 "config": [ 00:34:46.794 { 00:34:46.794 "method": "bdev_nvme_attach_controller", 00:34:46.794 "params": { 00:34:46.794 "trtype": "tcp", 00:34:46.794 "adrfam": "IPv4", 00:34:46.794 "name": "Nvme0", 00:34:46.794 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:46.794 "traddr": "10.0.0.2", 00:34:46.794 "trsvcid": "4420" 00:34:46.794 } 00:34:46.794 }, 00:34:46.794 { 00:34:46.794 "method": "bdev_set_options", 00:34:46.794 "params": { 00:34:46.794 "bdev_auto_examine": false 00:34:46.794 } 00:34:46.794 } 00:34:46.794 ] 00:34:46.794 } 00:34:46.794 ] 00:34:46.794 }' 00:34:46.794 [2024-07-15 09:38:55.541897] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 
initialization... 00:34:46.794 [2024-07-15 09:38:55.541972] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid296382 ] 00:34:46.794 [2024-07-15 09:38:55.671674] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:47.052 [2024-07-15 09:38:55.771754] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:47.312  Copying: 64/64 [kB] (average 12 MBps) 00:34:47.312 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:47.312 09:38:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:47.312 09:38:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.312 09:38:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:47.312 09:38:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:47.312 09:38:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:47.312 09:38:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.570 09:38:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@112 -- # (( 14 == 
stats[decrypt_executed] )) 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@114 -- # update_stats 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:47.571 09:38:56 chaining -- 
bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:47.571 09:38:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.571 09:38:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:47.829 09:38:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:47.829 09:38:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:47.829 09:38:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@117 -- # : 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.2rfrVCUncy --ib Nvme0n1 --bs 4096 --count 16 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@25 -- # local config 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:47.829 09:38:56 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:47.829 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:47.830 09:38:56 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:47.830 "subsystems": [ 00:34:47.830 { 00:34:47.830 "subsystem": "bdev", 00:34:47.830 "config": [ 00:34:47.830 { 00:34:47.830 "method": "bdev_nvme_attach_controller", 00:34:47.830 "params": { 00:34:47.830 "trtype": "tcp", 00:34:47.830 "adrfam": "IPv4", 00:34:47.830 "name": "Nvme0", 00:34:47.830 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:47.830 "traddr": "10.0.0.2", 00:34:47.830 "trsvcid": "4420" 00:34:47.830 } 00:34:47.830 }, 00:34:47.830 { 00:34:47.830 "method": "bdev_set_options", 00:34:47.830 "params": { 00:34:47.830 "bdev_auto_examine": false 00:34:47.830 } 00:34:47.830 } 00:34:47.830 ] 00:34:47.830 } 00:34:47.830 ] 00:34:47.830 }' 00:34:47.830 09:38:56 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.2rfrVCUncy --ib Nvme0n1 --bs 4096 --count 16 00:34:47.830 09:38:56 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:47.830 "subsystems": [ 00:34:47.830 { 00:34:47.830 "subsystem": "bdev", 00:34:47.830 "config": [ 00:34:47.830 { 00:34:47.830 "method": "bdev_nvme_attach_controller", 00:34:47.830 "params": { 00:34:47.830 "trtype": "tcp", 00:34:47.830 "adrfam": "IPv4", 00:34:47.830 "name": "Nvme0", 00:34:47.830 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:47.830 "traddr": "10.0.0.2", 00:34:47.830 "trsvcid": "4420" 
00:34:47.830 } 00:34:47.830 }, 00:34:47.830 { 00:34:47.830 "method": "bdev_set_options", 00:34:47.830 "params": { 00:34:47.830 "bdev_auto_examine": false 00:34:47.830 } 00:34:47.830 } 00:34:47.830 ] 00:34:47.830 } 00:34:47.830 ] 00:34:47.830 }' 00:34:47.830 [2024-07-15 09:38:56.701482] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:34:47.830 [2024-07-15 09:38:56.701547] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid296587 ] 00:34:48.088 [2024-07-15 09:38:56.829011] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:48.088 [2024-07-15 09:38:56.936363] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:48.604  Copying: 64/64 [kB] (average 1454 kBps) 00:34:48.604 00:34:48.604 09:38:57 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:34:48.604 09:38:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:48.604 09:38:57 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:48.604 09:38:57 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:48.604 09:38:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:48.604 09:38:57 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:48.604 09:38:57 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:48.604 09:38:57 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:48.605 09:38:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:48.605 09:38:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:48.605 09:38:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:48.605 09:38:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:48.605 09:38:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:48.605 09:38:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:48.605 09:38:57 chaining -- bdev/chaining.sh@43 -- 
# rpc_cmd accel_get_stats 00:34:48.605 09:38:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:48.605 09:38:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:48.605 09:38:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.dKUfCLHdZG /tmp/tmp.2rfrVCUncy 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.dKUfCLHdZG /tmp/tmp.2rfrVCUncy 00:34:48.863 09:38:57 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@117 -- # sync 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@120 -- # set +e 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:34:48.863 rmmod nvme_tcp 00:34:48.863 rmmod nvme_fabrics 00:34:48.863 rmmod nvme_keyring 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@124 -- # set -e 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@125 -- # return 0 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@489 -- # '[' -n 295711 ']' 00:34:48.863 09:38:57 chaining -- nvmf/common.sh@490 -- # killprocess 295711 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@948 -- # '[' -z 295711 ']' 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@952 -- # kill -0 295711 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@953 -- # uname 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 295711 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 295711' 00:34:48.863 killing process with pid 295711 00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@967 -- # kill 295711 
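The spdk_dd phase of the chaining test ends here: the written and read-back files compare equal and the accel counters moved by exactly the expected amounts. The get_stat/update_stats helpers that produce those counters are traced many times above; condensed into a bash sketch (reconstructed from the bdev/chaining.sh@37-@54 trace lines, not copied from the script itself), they amount to:

get_stat() {                                   # get_stat EVENT [OPCODE] [RPC]
    local event=$1 opcode=$2 rpc=${3:-rpc_cmd}
    if [[ -z $opcode ]]; then
        # top-level counter, e.g. sequence_executed
        $rpc accel_get_stats | jq -r ".$event"
    else
        # per-opcode counter, e.g. executed count for encrypt/decrypt/copy
        $rpc accel_get_stats | jq -r ".operations[] | select(.opcode == \"$opcode\").executed"
    fi
}

declare -A stats
update_stats() {                               # snapshot the counters compared after each spdk_dd pass
    stats[sequence_executed]=$(get_stat sequence_executed)
    stats[encrypt_executed]=$(get_stat executed encrypt)
    stats[decrypt_executed]=$(get_stat executed decrypt)
    stats[copy_executed]=$(get_stat executed copy)
}

Read this way, the assertions above are just per-pass deltas: a 16 x 4 KiB pass moves sequence_executed by 16 but encrypt or decrypt by 32, because every I/O traverses both crypto bdevs in the chain.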
00:34:48.863 09:38:57 chaining -- common/autotest_common.sh@972 -- # wait 295711 00:34:49.120 09:38:57 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:34:49.120 09:38:58 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:34:49.120 09:38:58 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:34:49.120 09:38:58 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:34:49.120 09:38:58 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:34:49.120 09:38:58 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:49.120 09:38:58 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:49.120 09:38:58 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:49.120 09:38:58 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:34:49.120 09:38:58 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:34:49.120 09:38:58 chaining -- bdev/chaining.sh@132 -- # bperfpid=296796 00:34:49.120 09:38:58 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:49.120 09:38:58 chaining -- bdev/chaining.sh@134 -- # waitforlisten 296796 00:34:49.120 09:38:58 chaining -- common/autotest_common.sh@829 -- # '[' -z 296796 ']' 00:34:49.120 09:38:58 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:49.120 09:38:58 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:49.120 09:38:58 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:49.120 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:49.120 09:38:58 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:49.120 09:38:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:49.378 [2024-07-15 09:38:58.117497] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
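From here the test switches from spdk_dd over NVMe-oF to a local bdevperf run: the app is started with --wait-for-rpc and -z, so it sits idle until the bare rpc_cmd a few lines below feeds it a bdev stack over the default /var/tmp/spdk.sock. Only the command output is echoed (malloc0, true, true, then crypto0 and crypto1 alongside the "Found key" notices), not the calls themselves; they are roughly the following, with cipher, key material and malloc sizing left as placeholders because they are not visible in the trace:

rpc_cmd bdev_malloc_create -b malloc0 <size_mb> <block_size>        # prints "malloc0"
rpc_cmd accel_crypto_key_create -c <cipher> -k <hex key> -n key0    # prints "true"
rpc_cmd accel_crypto_key_create -c <cipher> -k <hex key> -n key1    # prints "true"
rpc_cmd bdev_crypto_create malloc0 crypto0 -n key0                  # "Found key \"key0\""
rpc_cmd bdev_crypto_create crypto0 crypto1 -n key1                  # crypto1 stacked on crypto0: the chained pair the verify job exercises
# plus framework_start_init at the appropriate point, since the app was started
# with --wait-for-rpc (its exact placement is not visible in the trace)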
00:34:49.378 [2024-07-15 09:38:58.117567] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid296796 ] 00:34:49.378 [2024-07-15 09:38:58.248187] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:49.637 [2024-07-15 09:38:58.349318] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:50.204 09:38:58 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:50.204 09:38:58 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:50.204 09:38:58 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:34:50.204 09:38:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:50.204 09:38:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:50.204 malloc0 00:34:50.204 true 00:34:50.204 true 00:34:50.204 [2024-07-15 09:38:59.121148] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:50.204 crypto0 00:34:50.204 [2024-07-15 09:38:59.129171] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:50.204 crypto1 00:34:50.204 09:38:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:50.204 09:38:59 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:50.461 Running I/O for 5 seconds... 00:34:55.727 00:34:55.728 Latency(us) 00:34:55.728 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:55.728 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:55.728 Verification LBA range: start 0x0 length 0x2000 00:34:55.728 crypto1 : 5.01 11425.36 44.63 0.00 0.00 22339.42 6240.17 15386.71 00:34:55.728 =================================================================================================================== 00:34:55.728 Total : 11425.36 44.63 0.00 0.00 22339.42 6240.17 15386.71 00:34:55.728 0 00:34:55.728 09:39:04 chaining -- bdev/chaining.sh@146 -- # killprocess 296796 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@948 -- # '[' -z 296796 ']' 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@952 -- # kill -0 296796 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@953 -- # uname 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 296796 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 296796' 00:34:55.728 killing process with pid 296796 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@967 -- # kill 296796 00:34:55.728 Received shutdown signal, test time was about 5.000000 seconds 00:34:55.728 00:34:55.728 Latency(us) 00:34:55.728 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:55.728 =================================================================================================================== 00:34:55.728 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@972 -- # wait 296796 00:34:55.728 09:39:04 chaining -- bdev/chaining.sh@152 -- # bperfpid=297916 
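Before the second run is configured, killprocess tears down the first bdevperf instance (pid 296796), just as it did for the nvmf target (pid 295711). Reconstructed from the autotest_common.sh@948-@975 trace lines, the helper is roughly this sketch:

killprocess() {
    local pid=$1 name
    if ! kill -0 "$pid" 2>/dev/null; then          # @952 liveness probe (fails later for the already-exited pid 297916)
        echo "Process with pid $pid is not found"  # @975
        return
    fi
    [[ $(uname) == Linux ]]                         # @953
    name=$(ps --no-headers -o comm= "$pid")         # @954: reactor_0 / reactor_1 in this log
    [[ $name == sudo ]] && :                        # @958: sudo-wrapped case, never taken here; handling omitted
    echo "killing process with pid $pid"            # @966
    kill "$pid"                                     # @967
    wait "$pid"                                     # @972: reap and propagate the exit status
}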
00:34:55.728 09:39:04 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:55.728 09:39:04 chaining -- bdev/chaining.sh@154 -- # waitforlisten 297916 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@829 -- # '[' -z 297916 ']' 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:55.728 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:55.728 09:39:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:55.728 [2024-07-15 09:39:04.624514] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:34:55.728 [2024-07-15 09:39:04.624586] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid297916 ] 00:34:55.986 [2024-07-15 09:39:04.755015] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:55.986 [2024-07-15 09:39:04.860848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:56.921 09:39:05 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:56.921 09:39:05 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:56.921 09:39:05 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:34:56.921 09:39:05 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:56.921 09:39:05 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:56.921 malloc0 00:34:56.921 true 00:34:56.921 true 00:34:56.921 [2024-07-15 09:39:05.703574] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:34:56.921 [2024-07-15 09:39:05.703621] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:56.921 [2024-07-15 09:39:05.703642] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc6730 00:34:56.921 [2024-07-15 09:39:05.703655] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:56.922 [2024-07-15 09:39:05.704736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:56.922 [2024-07-15 09:39:05.704765] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:34:56.922 pt0 00:34:56.922 [2024-07-15 09:39:05.711604] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:56.922 crypto0 00:34:56.922 [2024-07-15 09:39:05.719624] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:56.922 crypto1 00:34:56.922 09:39:05 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:56.922 09:39:05 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:56.922 Running I/O for 5 seconds... 
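This second bdevperf pass differs from the previous one only in the extra passthru layer: the vbdev_passthru notices above show pt0 claiming malloc0, and the crypto pair now sits on pt0. The assumed additions to the RPC setup sketched earlier (again, only the notices are in the trace) are:

rpc_cmd bdev_passthru_create -b malloc0 -p pt0   # "Match on malloc0" ... "created pt_bdev for: pt0"
rpc_cmd bdev_crypto_create pt0 crypto0 -n key0   # crypto0 now stacks on pt0 rather than directly on malloc0
rpc_cmd bdev_crypto_create crypto0 crypto1 -n key1

The 5-second verify pass whose results follow exercises that taller stack.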
00:35:02.183 00:35:02.183 Latency(us) 00:35:02.183 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:02.183 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:02.183 Verification LBA range: start 0x0 length 0x2000 00:35:02.183 crypto1 : 5.02 9083.05 35.48 0.00 0.00 28108.10 6553.60 16868.40 00:35:02.183 =================================================================================================================== 00:35:02.183 Total : 9083.05 35.48 0.00 0.00 28108.10 6553.60 16868.40 00:35:02.183 0 00:35:02.183 09:39:10 chaining -- bdev/chaining.sh@167 -- # killprocess 297916 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@948 -- # '[' -z 297916 ']' 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@952 -- # kill -0 297916 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@953 -- # uname 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 297916 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 297916' 00:35:02.183 killing process with pid 297916 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@967 -- # kill 297916 00:35:02.183 Received shutdown signal, test time was about 5.000000 seconds 00:35:02.183 00:35:02.183 Latency(us) 00:35:02.183 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:02.183 =================================================================================================================== 00:35:02.183 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:02.183 09:39:10 chaining -- common/autotest_common.sh@972 -- # wait 297916 00:35:02.442 09:39:11 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:35:02.442 09:39:11 chaining -- bdev/chaining.sh@170 -- # killprocess 297916 00:35:02.442 09:39:11 chaining -- common/autotest_common.sh@948 -- # '[' -z 297916 ']' 00:35:02.442 09:39:11 chaining -- common/autotest_common.sh@952 -- # kill -0 297916 00:35:02.442 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (297916) - No such process 00:35:02.442 09:39:11 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 297916 is not found' 00:35:02.442 Process with pid 297916 is not found 00:35:02.442 09:39:11 chaining -- bdev/chaining.sh@171 -- # wait 297916 00:35:02.442 09:39:11 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:02.442 09:39:11 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:02.442 09:39:11 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:02.442 09:39:11 chaining -- 
nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:02.442 09:39:11 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@336 -- # return 1 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:02.442 WARNING: No supported devices were found, fallback requested for tcp test 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:02.442 09:39:11 chaining -- 
nvmf/common.sh@432 -- # nvmf_veth_init 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:02.442 Cannot find device "nvmf_tgt_br" 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@155 -- # true 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:02.442 Cannot find device "nvmf_tgt_br2" 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@156 -- # true 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:02.442 Cannot find device "nvmf_tgt_br" 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@158 -- # true 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:02.442 Cannot find device "nvmf_tgt_br2" 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@159 -- # true 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:02.442 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@162 -- # true 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:02.442 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@163 -- # true 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:02.442 09:39:11 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@178 -- # ip addr 
add 10.0.0.1/24 dev nvmf_init_if 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:02.699 09:39:11 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:02.991 09:39:11 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:02.991 09:39:11 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:02.991 09:39:11 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:02.991 09:39:11 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:02.991 09:39:11 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:02.991 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:02.991 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.113 ms 00:35:02.991 00:35:02.991 --- 10.0.0.2 ping statistics --- 00:35:02.991 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:02.991 rtt min/avg/max/mdev = 0.113/0.113/0.113/0.000 ms 00:35:02.991 09:39:11 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:02.991 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:35:02.991 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.073 ms 00:35:02.991 00:35:02.991 --- 10.0.0.3 ping statistics --- 00:35:02.991 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:02.991 rtt min/avg/max/mdev = 0.073/0.073/0.073/0.000 ms 00:35:02.991 09:39:11 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:03.249 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:35:03.249 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.038 ms 00:35:03.249 00:35:03.249 --- 10.0.0.1 ping statistics --- 00:35:03.249 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:03.249 rtt min/avg/max/mdev = 0.038/0.038/0.038/0.000 ms 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@433 -- # return 0 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:03.249 09:39:11 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:35:03.249 09:39:11 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:03.249 09:39:11 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:03.249 09:39:11 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:03.249 09:39:12 chaining -- nvmf/common.sh@481 -- # nvmfpid=299323 00:35:03.249 09:39:12 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:03.249 09:39:12 chaining -- nvmf/common.sh@482 -- # waitforlisten 299323 00:35:03.249 09:39:12 chaining -- common/autotest_common.sh@829 -- # '[' -z 299323 ']' 00:35:03.249 09:39:12 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:03.249 09:39:12 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:03.249 09:39:12 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:03.249 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:03.249 09:39:12 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:03.249 09:39:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:03.249 [2024-07-15 09:39:12.080269] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:35:03.249 [2024-07-15 09:39:12.080339] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:03.507 [2024-07-15 09:39:12.207457] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:03.507 [2024-07-15 09:39:12.312439] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:03.507 [2024-07-15 09:39:12.312489] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:03.507 [2024-07-15 09:39:12.312509] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:03.507 [2024-07-15 09:39:12.312522] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 
00:35:03.508 [2024-07-15 09:39:12.312534] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:35:03.508 [2024-07-15 09:39:12.312566] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:04.073 09:39:12 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:04.073 09:39:12 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:04.073 09:39:12 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:04.073 09:39:12 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:04.073 09:39:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:04.332 09:39:13 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:04.332 09:39:13 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:35:04.332 09:39:13 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:04.332 09:39:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:04.332 malloc0 00:35:04.332 [2024-07-15 09:39:13.060241] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:04.332 [2024-07-15 09:39:13.076440] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:04.332 09:39:13 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:04.332 09:39:13 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:35:04.332 09:39:13 chaining -- bdev/chaining.sh@189 -- # bperfpid=299516 00:35:04.332 09:39:13 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:04.332 09:39:13 chaining -- bdev/chaining.sh@191 -- # waitforlisten 299516 /var/tmp/bperf.sock 00:35:04.332 09:39:13 chaining -- common/autotest_common.sh@829 -- # '[' -z 299516 ']' 00:35:04.332 09:39:13 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:04.332 09:39:13 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:04.332 09:39:13 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:04.332 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:04.332 09:39:13 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:04.332 09:39:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:04.332 [2024-07-15 09:39:13.149131] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 
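The target side of this final phase is set up the same way: the bare rpc_cmd above echoes only malloc0 plus the tcp.c transport and listener notices, but for the bperf runs that follow to attach an nvme0n1 controller through 10.0.0.2:4420 the target needs a subsystem exposing malloc0. The assumed sequence, reusing the nqn that gen_nvme.sh targeted earlier and leaving sizes and the serial as placeholders, is roughly:

rpc_cmd bdev_malloc_create -b malloc0 <size_mb> <block_size>
rpc_cmd nvmf_create_transport -t tcp -o                       # "*** TCP Transport Init ***"; flags match NVMF_TRANSPORT_OPTS='-t tcp -o' above
rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s <serial>
rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 malloc0
rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420   # "Listening on 10.0.0.2 port 4420"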
00:35:04.332 [2024-07-15 09:39:13.149193] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid299516 ] 00:35:04.332 [2024-07-15 09:39:13.278478] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:04.590 [2024-07-15 09:39:13.381349] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:05.156 09:39:14 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:05.156 09:39:14 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:05.156 09:39:14 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:35:05.156 09:39:14 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:05.722 [2024-07-15 09:39:14.481494] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:05.722 nvme0n1 00:35:05.722 true 00:35:05.722 crypto0 00:35:05.722 09:39:14 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:05.722 Running I/O for 5 seconds... 00:35:10.991 00:35:10.991 Latency(us) 00:35:10.991 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:10.991 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:10.991 Verification LBA range: start 0x0 length 0x2000 00:35:10.991 crypto0 : 5.02 8339.55 32.58 0.00 0.00 30597.54 2165.54 24276.81 00:35:10.991 =================================================================================================================== 00:35:10.991 Total : 8339.55 32.58 0.00 0.00 30597.54 2165.54 24276.81 00:35:10.991 0 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@205 -- # sequence=83736 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "encrypt").executed' 00:35:10.991 09:39:19 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:11.249 09:39:20 chaining -- bdev/chaining.sh@206 -- # encrypt=41868 00:35:11.249 09:39:20 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:35:11.249 09:39:20 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:11.249 09:39:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:11.249 09:39:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:11.249 09:39:20 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:11.249 09:39:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:11.249 09:39:20 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:11.250 09:39:20 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:11.250 09:39:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:11.250 09:39:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@207 -- # decrypt=41868 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:11.508 09:39:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:11.767 09:39:20 chaining -- bdev/chaining.sh@208 -- # crc32c=83736 00:35:11.767 09:39:20 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:35:11.767 09:39:20 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:35:11.767 09:39:20 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:35:11.767 09:39:20 chaining -- bdev/chaining.sh@214 -- # killprocess 299516 00:35:11.767 09:39:20 chaining -- common/autotest_common.sh@948 -- # '[' -z 299516 ']' 00:35:11.767 09:39:20 chaining -- common/autotest_common.sh@952 -- # kill -0 299516 00:35:11.767 09:39:20 chaining -- common/autotest_common.sh@953 -- # uname 00:35:11.767 09:39:20 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:11.767 09:39:20 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 299516 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 299516' 00:35:12.026 killing process with pid 299516 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@967 -- # kill 299516 00:35:12.026 Received shutdown signal, test time was about 
5.000000 seconds 00:35:12.026 00:35:12.026 Latency(us) 00:35:12.026 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:12.026 =================================================================================================================== 00:35:12.026 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@972 -- # wait 299516 00:35:12.026 09:39:20 chaining -- bdev/chaining.sh@219 -- # bperfpid=300567 00:35:12.026 09:39:20 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:35:12.026 09:39:20 chaining -- bdev/chaining.sh@221 -- # waitforlisten 300567 /var/tmp/bperf.sock 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@829 -- # '[' -z 300567 ']' 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:12.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:12.026 09:39:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:12.283 [2024-07-15 09:39:21.020286] Starting SPDK v24.09-pre git sha1 4835eb82b / DPDK 24.03.0 initialization... 00:35:12.283 [2024-07-15 09:39:21.020354] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid300567 ] 00:35:12.283 [2024-07-15 09:39:21.150608] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:12.541 [2024-07-15 09:39:21.253337] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:13.106 09:39:21 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:13.106 09:39:21 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:13.106 09:39:21 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:35:13.106 09:39:21 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:13.672 [2024-07-15 09:39:22.382182] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:13.672 nvme0n1 00:35:13.672 true 00:35:13.672 crypto0 00:35:13.672 09:39:22 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:13.672 Running I/O for 5 seconds... 
00:35:18.939 00:35:18.939 Latency(us) 00:35:18.939 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:18.939 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:35:18.939 Verification LBA range: start 0x0 length 0x200 00:35:18.939 crypto0 : 5.01 1687.99 105.50 0.00 0.00 18581.26 1132.63 18919.96 00:35:18.939 =================================================================================================================== 00:35:18.939 Total : 1687.99 105.50 0.00 0.00 18581.26 1132.63 18919.96 00:35:18.939 0 00:35:18.939 09:39:27 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:35:18.939 09:39:27 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:18.939 09:39:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@233 -- # sequence=16902 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:18.940 09:39:27 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@234 -- # encrypt=8451 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:19.197 09:39:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@235 -- # decrypt=8451 
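The per-opcode counters extracted above come from the accel framework's accel_get_stats RPC: the harness pulls the JSON over the bperf socket and filters it with jq, then asserts the chaining invariants a few trace lines further on. A small sketch of that extraction pattern, with the helper names chosen here purely for illustration:

    #!/usr/bin/env bash
    # Sketch of the stat extraction used by the chaining test: accel_get_stats is
    # queried over the bdevperf RPC socket and per-opcode "executed" counters are
    # pulled out with jq. Helper names are illustrative, not the test's own.
    set -euo pipefail
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}
    SOCK=/var/tmp/bperf.sock

    rpc_bperf() { "$SPDK_DIR/scripts/rpc.py" -s "$SOCK" "$@"; }

    # Total number of completed accel sequences.
    sequence=$(rpc_bperf accel_get_stats | jq -r '.sequence_executed')

    # Executed count for a single opcode (encrypt, decrypt, crc32c, ...).
    get_executed() {
        rpc_bperf accel_get_stats |
            jq -r --arg op "$1" '.operations[] | select(.opcode == $op).executed'
    }

    encrypt=$(get_executed encrypt)
    decrypt=$(get_executed decrypt)
    crc32c=$(get_executed crc32c)

    # Invariants mirrored from the trace: every completed sequence is expected to
    # carry one encrypt or decrypt plus one crc32c operation.
    (( sequence > 0 ))
    (( encrypt + decrypt == sequence ))
    (( encrypt + decrypt == crc32c ))
    echo "sequences=$sequence encrypt=$encrypt decrypt=$decrypt crc32c=$crc32c"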
00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:19.454 09:39:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:19.712 09:39:28 chaining -- bdev/chaining.sh@236 -- # crc32c=16902 00:35:19.712 09:39:28 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:35:19.712 09:39:28 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:35:19.712 09:39:28 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:35:19.712 09:39:28 chaining -- bdev/chaining.sh@242 -- # killprocess 300567 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@948 -- # '[' -z 300567 ']' 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@952 -- # kill -0 300567 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@953 -- # uname 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 300567 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 300567' 00:35:19.712 killing process with pid 300567 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@967 -- # kill 300567 00:35:19.712 Received shutdown signal, test time was about 5.000000 seconds 00:35:19.712 00:35:19.712 Latency(us) 00:35:19.712 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:19.712 =================================================================================================================== 00:35:19.712 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:19.712 09:39:28 chaining -- common/autotest_common.sh@972 -- # wait 300567 00:35:19.970 09:39:28 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:35:19.970 09:39:28 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:19.970 09:39:28 chaining -- nvmf/common.sh@117 -- # sync 00:35:19.970 09:39:28 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:19.970 09:39:28 chaining -- nvmf/common.sh@120 -- # set +e 00:35:19.970 09:39:28 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:19.970 09:39:28 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:19.970 rmmod nvme_tcp 00:35:19.970 rmmod nvme_fabrics 00:35:19.970 rmmod nvme_keyring 00:35:19.970 09:39:28 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:20.228 09:39:28 chaining -- nvmf/common.sh@124 -- # set -e 00:35:20.228 09:39:28 chaining -- nvmf/common.sh@125 -- # return 0 00:35:20.228 09:39:28 chaining -- nvmf/common.sh@489 -- # '[' -n 
299323 ']' 00:35:20.228 09:39:28 chaining -- nvmf/common.sh@490 -- # killprocess 299323 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@948 -- # '[' -z 299323 ']' 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@952 -- # kill -0 299323 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@953 -- # uname 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 299323 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 299323' 00:35:20.228 killing process with pid 299323 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@967 -- # kill 299323 00:35:20.228 09:39:28 chaining -- common/autotest_common.sh@972 -- # wait 299323 00:35:20.486 09:39:29 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:20.486 09:39:29 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:20.486 09:39:29 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:20.486 09:39:29 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:20.486 09:39:29 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:20.486 09:39:29 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:20.486 09:39:29 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:20.486 09:39:29 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:20.486 09:39:29 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:20.486 09:39:29 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:35:20.486 00:35:20.486 real 0m45.713s 00:35:20.486 user 0m59.667s 00:35:20.486 sys 0m12.943s 00:35:20.486 09:39:29 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:20.486 09:39:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:20.486 ************************************ 00:35:20.486 END TEST chaining 00:35:20.486 ************************************ 00:35:20.486 09:39:29 -- common/autotest_common.sh@1142 -- # return 0 00:35:20.486 09:39:29 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:35:20.486 09:39:29 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:35:20.486 09:39:29 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:35:20.486 09:39:29 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:35:20.486 09:39:29 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:35:20.486 09:39:29 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:35:20.486 09:39:29 -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:20.486 09:39:29 -- common/autotest_common.sh@10 -- # set +x 00:35:20.486 09:39:29 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:35:20.486 09:39:29 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:35:20.486 09:39:29 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:35:20.486 09:39:29 -- common/autotest_common.sh@10 -- # set +x 00:35:25.830 INFO: APP EXITING 00:35:25.830 INFO: killing all VMs 00:35:25.830 INFO: killing vhost app 00:35:25.830 INFO: EXIT DONE 00:35:29.116 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:35:29.116 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:35:29.116 Waiting for block devices as 
requested 00:35:29.116 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:35:29.116 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:35:29.375 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:35:29.375 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:35:29.375 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:35:29.633 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:35:29.633 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:35:29.633 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:35:29.892 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:35:29.892 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:35:29.892 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:35:30.151 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:35:30.151 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:35:30.151 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:35:30.409 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:35:30.409 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:35:30.409 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:35:34.596 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:35:34.596 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:35:34.596 Cleaning 00:35:34.596 Removing: /var/run/dpdk/spdk0/config 00:35:34.596 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:35:34.596 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:35:34.596 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:35:34.596 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:35:34.596 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:35:34.596 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:35:34.596 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:35:34.596 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:35:34.596 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:35:34.596 Removing: /var/run/dpdk/spdk0/hugepage_info 00:35:34.596 Removing: /dev/shm/nvmf_trace.0 00:35:34.596 Removing: /dev/shm/spdk_tgt_trace.pid43755 00:35:34.596 Removing: /var/run/dpdk/spdk0 00:35:34.596 Removing: /var/run/dpdk/spdk_pid100165 00:35:34.596 Removing: /var/run/dpdk/spdk_pid102600 00:35:34.596 Removing: /var/run/dpdk/spdk_pid103416 00:35:34.596 Removing: /var/run/dpdk/spdk_pid113497 00:35:34.596 Removing: /var/run/dpdk/spdk_pid115710 00:35:34.596 Removing: /var/run/dpdk/spdk_pid116688 00:35:34.596 Removing: /var/run/dpdk/spdk_pid126777 00:35:34.596 Removing: /var/run/dpdk/spdk_pid129003 00:35:34.854 Removing: /var/run/dpdk/spdk_pid129985 00:35:34.854 Removing: /var/run/dpdk/spdk_pid140041 00:35:34.854 Removing: /var/run/dpdk/spdk_pid143314 00:35:34.854 Removing: /var/run/dpdk/spdk_pid144312 00:35:34.854 Removing: /var/run/dpdk/spdk_pid155193 00:35:34.854 Removing: /var/run/dpdk/spdk_pid157533 00:35:34.854 Removing: /var/run/dpdk/spdk_pid158646 00:35:34.854 Removing: /var/run/dpdk/spdk_pid169877 00:35:34.854 Removing: /var/run/dpdk/spdk_pid172333 00:35:34.854 Removing: /var/run/dpdk/spdk_pid173617 00:35:34.854 Removing: /var/run/dpdk/spdk_pid184586 00:35:34.855 Removing: /var/run/dpdk/spdk_pid188622 00:35:34.855 Removing: /var/run/dpdk/spdk_pid189770 00:35:34.855 Removing: /var/run/dpdk/spdk_pid190760 00:35:34.855 Removing: /var/run/dpdk/spdk_pid194311 00:35:34.855 Removing: /var/run/dpdk/spdk_pid199502 00:35:34.855 Removing: /var/run/dpdk/spdk_pid202016 00:35:34.855 Removing: /var/run/dpdk/spdk_pid206320 00:35:34.855 Removing: /var/run/dpdk/spdk_pid209728 00:35:34.855 Removing: /var/run/dpdk/spdk_pid215112 00:35:34.855 Removing: 
/var/run/dpdk/spdk_pid217820 00:35:34.855 Removing: /var/run/dpdk/spdk_pid224793 00:35:34.855 Removing: /var/run/dpdk/spdk_pid227065 00:35:34.855 Removing: /var/run/dpdk/spdk_pid233204 00:35:34.855 Removing: /var/run/dpdk/spdk_pid235545 00:35:34.855 Removing: /var/run/dpdk/spdk_pid241575 00:35:34.855 Removing: /var/run/dpdk/spdk_pid243990 00:35:34.855 Removing: /var/run/dpdk/spdk_pid248654 00:35:34.855 Removing: /var/run/dpdk/spdk_pid249008 00:35:34.855 Removing: /var/run/dpdk/spdk_pid249367 00:35:34.855 Removing: /var/run/dpdk/spdk_pid249720 00:35:34.855 Removing: /var/run/dpdk/spdk_pid250221 00:35:34.855 Removing: /var/run/dpdk/spdk_pid251066 00:35:34.855 Removing: /var/run/dpdk/spdk_pid251773 00:35:34.855 Removing: /var/run/dpdk/spdk_pid252220 00:35:34.855 Removing: /var/run/dpdk/spdk_pid253908 00:35:34.855 Removing: /var/run/dpdk/spdk_pid255580 00:35:34.855 Removing: /var/run/dpdk/spdk_pid257190 00:35:34.855 Removing: /var/run/dpdk/spdk_pid258490 00:35:34.855 Removing: /var/run/dpdk/spdk_pid260092 00:35:34.855 Removing: /var/run/dpdk/spdk_pid261691 00:35:34.855 Removing: /var/run/dpdk/spdk_pid263293 00:35:34.855 Removing: /var/run/dpdk/spdk_pid264592 00:35:34.855 Removing: /var/run/dpdk/spdk_pid265186 00:35:34.855 Removing: /var/run/dpdk/spdk_pid265671 00:35:34.855 Removing: /var/run/dpdk/spdk_pid267679 00:35:34.855 Removing: /var/run/dpdk/spdk_pid269599 00:35:34.855 Removing: /var/run/dpdk/spdk_pid271919 00:35:34.855 Removing: /var/run/dpdk/spdk_pid273016 00:35:34.855 Removing: /var/run/dpdk/spdk_pid274171 00:35:34.855 Removing: /var/run/dpdk/spdk_pid274727 00:35:34.855 Removing: /var/run/dpdk/spdk_pid274912 00:35:34.855 Removing: /var/run/dpdk/spdk_pid274986 00:35:34.855 Removing: /var/run/dpdk/spdk_pid275215 00:35:34.855 Removing: /var/run/dpdk/spdk_pid275372 00:35:34.855 Removing: /var/run/dpdk/spdk_pid276607 00:35:34.855 Removing: /var/run/dpdk/spdk_pid278120 00:35:35.113 Removing: /var/run/dpdk/spdk_pid279615 00:35:35.113 Removing: /var/run/dpdk/spdk_pid280337 00:35:35.113 Removing: /var/run/dpdk/spdk_pid281219 00:35:35.113 Removing: /var/run/dpdk/spdk_pid281415 00:35:35.113 Removing: /var/run/dpdk/spdk_pid281441 00:35:35.113 Removing: /var/run/dpdk/spdk_pid281628 00:35:35.113 Removing: /var/run/dpdk/spdk_pid282468 00:35:35.113 Removing: /var/run/dpdk/spdk_pid283104 00:35:35.113 Removing: /var/run/dpdk/spdk_pid283484 00:35:35.113 Removing: /var/run/dpdk/spdk_pid285586 00:35:35.113 Removing: /var/run/dpdk/spdk_pid287379 00:35:35.113 Removing: /var/run/dpdk/spdk_pid289180 00:35:35.113 Removing: /var/run/dpdk/spdk_pid290235 00:35:35.113 Removing: /var/run/dpdk/spdk_pid291472 00:35:35.113 Removing: /var/run/dpdk/spdk_pid292014 00:35:35.113 Removing: /var/run/dpdk/spdk_pid292035 00:35:35.113 Removing: /var/run/dpdk/spdk_pid295934 00:35:35.113 Removing: /var/run/dpdk/spdk_pid296144 00:35:35.113 Removing: /var/run/dpdk/spdk_pid296343 00:35:35.113 Removing: /var/run/dpdk/spdk_pid296382 00:35:35.113 Removing: /var/run/dpdk/spdk_pid296587 00:35:35.113 Removing: /var/run/dpdk/spdk_pid296796 00:35:35.113 Removing: /var/run/dpdk/spdk_pid297916 00:35:35.113 Removing: /var/run/dpdk/spdk_pid299516 00:35:35.113 Removing: /var/run/dpdk/spdk_pid300567 00:35:35.113 Removing: /var/run/dpdk/spdk_pid42847 00:35:35.113 Removing: /var/run/dpdk/spdk_pid43755 00:35:35.113 Removing: /var/run/dpdk/spdk_pid44285 00:35:35.113 Removing: /var/run/dpdk/spdk_pid45021 00:35:35.113 Removing: /var/run/dpdk/spdk_pid45203 00:35:35.113 Removing: /var/run/dpdk/spdk_pid45965 00:35:35.113 Removing: 
/var/run/dpdk/spdk_pid46140 00:35:35.113 Removing: /var/run/dpdk/spdk_pid46426 00:35:35.113 Removing: /var/run/dpdk/spdk_pid49046 00:35:35.113 Removing: /var/run/dpdk/spdk_pid50381 00:35:35.113 Removing: /var/run/dpdk/spdk_pid50613 00:35:35.113 Removing: /var/run/dpdk/spdk_pid50853 00:35:35.113 Removing: /var/run/dpdk/spdk_pid51262 00:35:35.113 Removing: /var/run/dpdk/spdk_pid51503 00:35:35.113 Removing: /var/run/dpdk/spdk_pid51704 00:35:35.113 Removing: /var/run/dpdk/spdk_pid51902 00:35:35.113 Removing: /var/run/dpdk/spdk_pid52126 00:35:35.113 Removing: /var/run/dpdk/spdk_pid52877 00:35:35.113 Removing: /var/run/dpdk/spdk_pid55586 00:35:35.113 Removing: /var/run/dpdk/spdk_pid55779 00:35:35.113 Removing: /var/run/dpdk/spdk_pid56021 00:35:35.113 Removing: /var/run/dpdk/spdk_pid56235 00:35:35.113 Removing: /var/run/dpdk/spdk_pid56425 00:35:35.113 Removing: /var/run/dpdk/spdk_pid56498 00:35:35.113 Removing: /var/run/dpdk/spdk_pid56727 00:35:35.113 Removing: /var/run/dpdk/spdk_pid57036 00:35:35.113 Removing: /var/run/dpdk/spdk_pid57244 00:35:35.113 Removing: /var/run/dpdk/spdk_pid57548 00:35:35.113 Removing: /var/run/dpdk/spdk_pid57768 00:35:35.113 Removing: /var/run/dpdk/spdk_pid58090 00:35:35.113 Removing: /var/run/dpdk/spdk_pid58594 00:35:35.113 Removing: /var/run/dpdk/spdk_pid58898 00:35:35.113 Removing: /var/run/dpdk/spdk_pid59103 00:35:35.370 Removing: /var/run/dpdk/spdk_pid59296 00:35:35.370 Removing: /var/run/dpdk/spdk_pid59505 00:35:35.370 Removing: /var/run/dpdk/spdk_pid59700 00:35:35.370 Removing: /var/run/dpdk/spdk_pid59905 00:35:35.370 Removing: /var/run/dpdk/spdk_pid60245 00:35:35.370 Removing: /var/run/dpdk/spdk_pid60454 00:35:35.370 Removing: /var/run/dpdk/spdk_pid60648 00:35:35.370 Removing: /var/run/dpdk/spdk_pid60850 00:35:35.370 Removing: /var/run/dpdk/spdk_pid61053 00:35:35.370 Removing: /var/run/dpdk/spdk_pid61331 00:35:35.370 Removing: /var/run/dpdk/spdk_pid61605 00:35:35.370 Removing: /var/run/dpdk/spdk_pid61809 00:35:35.370 Removing: /var/run/dpdk/spdk_pid62174 00:35:35.370 Removing: /var/run/dpdk/spdk_pid62531 00:35:35.370 Removing: /var/run/dpdk/spdk_pid62738 00:35:35.370 Removing: /var/run/dpdk/spdk_pid63111 00:35:35.370 Removing: /var/run/dpdk/spdk_pid63478 00:35:35.370 Removing: /var/run/dpdk/spdk_pid63750 00:35:35.370 Removing: /var/run/dpdk/spdk_pid64043 00:35:35.370 Removing: /var/run/dpdk/spdk_pid64279 00:35:35.370 Removing: /var/run/dpdk/spdk_pid64535 00:35:35.370 Removing: /var/run/dpdk/spdk_pid65003 00:35:35.370 Removing: /var/run/dpdk/spdk_pid65378 00:35:35.370 Removing: /var/run/dpdk/spdk_pid65570 00:35:35.370 Removing: /var/run/dpdk/spdk_pid69539 00:35:35.370 Removing: /var/run/dpdk/spdk_pid71252 00:35:35.370 Removing: /var/run/dpdk/spdk_pid72935 00:35:35.370 Removing: /var/run/dpdk/spdk_pid73835 00:35:35.370 Removing: /var/run/dpdk/spdk_pid74904 00:35:35.370 Removing: /var/run/dpdk/spdk_pid75187 00:35:35.370 Removing: /var/run/dpdk/spdk_pid75299 00:35:35.370 Removing: /var/run/dpdk/spdk_pid75321 00:35:35.370 Removing: /var/run/dpdk/spdk_pid79107 00:35:35.370 Removing: /var/run/dpdk/spdk_pid79510 00:35:35.370 Removing: /var/run/dpdk/spdk_pid80549 00:35:35.370 Removing: /var/run/dpdk/spdk_pid80748 00:35:35.370 Removing: /var/run/dpdk/spdk_pid86749 00:35:35.370 Removing: /var/run/dpdk/spdk_pid88379 00:35:35.370 Removing: /var/run/dpdk/spdk_pid89192 00:35:35.370 Removing: /var/run/dpdk/spdk_pid93456 00:35:35.370 Removing: /var/run/dpdk/spdk_pid95082 00:35:35.370 Removing: /var/run/dpdk/spdk_pid96055 00:35:35.370 Clean 00:35:35.627 09:39:44 -- 
common/autotest_common.sh@1451 -- # return 0 00:35:35.627 09:39:44 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:35:35.627 09:39:44 -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:35.627 09:39:44 -- common/autotest_common.sh@10 -- # set +x 00:35:35.627 09:39:44 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:35:35.627 09:39:44 -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:35.627 09:39:44 -- common/autotest_common.sh@10 -- # set +x 00:35:35.627 09:39:44 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:35:35.627 09:39:44 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:35:35.627 09:39:44 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:35:35.627 09:39:44 -- spdk/autotest.sh@391 -- # hash lcov 00:35:35.627 09:39:44 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:35:35.628 09:39:44 -- spdk/autotest.sh@393 -- # hostname 00:35:35.628 09:39:44 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:35:35.886 geninfo: WARNING: invalid characters removed from testname! 00:36:07.942 09:40:11 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:11.225 09:40:19 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:17.830 09:40:25 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:23.090 09:40:31 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:29.646 09:40:37 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' 
-o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:31.542 09:40:40 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:34.098 09:40:42 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:36:34.098 09:40:43 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:36:34.098 09:40:43 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:36:34.098 09:40:43 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:36:34.098 09:40:43 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:36:34.098 09:40:43 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:34.098 09:40:43 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:34.098 09:40:43 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:34.098 09:40:43 -- paths/export.sh@5 -- $ export PATH 00:36:34.098 09:40:43 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:34.098 09:40:43 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:34.098 09:40:43 -- common/autobuild_common.sh@444 -- $ date +%s 00:36:34.098 09:40:43 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721029243.XXXXXX 00:36:34.098 09:40:43 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721029243.c7CQQa 00:36:34.098 09:40:43 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:36:34.098 09:40:43 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:36:34.098 09:40:43 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:36:34.098 09:40:43 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude 
/var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:36:34.098 09:40:43 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:36:34.098 09:40:43 -- common/autobuild_common.sh@460 -- $ get_config_params 00:36:34.098 09:40:43 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:36:34.098 09:40:43 -- common/autotest_common.sh@10 -- $ set +x 00:36:34.357 09:40:43 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:36:34.357 09:40:43 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:36:34.357 09:40:43 -- pm/common@17 -- $ local monitor 00:36:34.357 09:40:43 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:34.357 09:40:43 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:34.357 09:40:43 -- pm/common@21 -- $ date +%s 00:36:34.357 09:40:43 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:34.357 09:40:43 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:34.357 09:40:43 -- pm/common@21 -- $ date +%s 00:36:34.357 09:40:43 -- pm/common@25 -- $ sleep 1 00:36:34.357 09:40:43 -- pm/common@21 -- $ date +%s 00:36:34.357 09:40:43 -- pm/common@21 -- $ date +%s 00:36:34.357 09:40:43 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721029243 00:36:34.357 09:40:43 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721029243 00:36:34.357 09:40:43 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721029243 00:36:34.357 09:40:43 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721029243 00:36:34.357 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721029243_collect-vmstat.pm.log 00:36:34.357 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721029243_collect-cpu-load.pm.log 00:36:34.357 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721029243_collect-cpu-temp.pm.log 00:36:34.357 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721029243_collect-bmc-pm.bmc.pm.log 00:36:35.288 09:40:44 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:36:35.288 09:40:44 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:36:35.288 09:40:44 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:35.288 09:40:44 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:36:35.288 09:40:44 -- 
spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:36:35.288 09:40:44 -- spdk/autopackage.sh@19 -- $ timing_finish 00:36:35.288 09:40:44 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:36:35.288 09:40:44 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:36:35.288 09:40:44 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:35.288 09:40:44 -- spdk/autopackage.sh@20 -- $ exit 0 00:36:35.288 09:40:44 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:36:35.288 09:40:44 -- pm/common@29 -- $ signal_monitor_resources TERM 00:36:35.288 09:40:44 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:36:35.288 09:40:44 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:35.289 09:40:44 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:36:35.289 09:40:44 -- pm/common@44 -- $ pid=311373 00:36:35.289 09:40:44 -- pm/common@50 -- $ kill -TERM 311373 00:36:35.289 09:40:44 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:35.289 09:40:44 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:36:35.289 09:40:44 -- pm/common@44 -- $ pid=311375 00:36:35.289 09:40:44 -- pm/common@50 -- $ kill -TERM 311375 00:36:35.289 09:40:44 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:35.289 09:40:44 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:36:35.289 09:40:44 -- pm/common@44 -- $ pid=311378 00:36:35.289 09:40:44 -- pm/common@50 -- $ kill -TERM 311378 00:36:35.289 09:40:44 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:35.289 09:40:44 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:36:35.289 09:40:44 -- pm/common@44 -- $ pid=311401 00:36:35.289 09:40:44 -- pm/common@50 -- $ sudo -E kill -TERM 311401 00:36:35.289 + [[ -n 4121066 ]] 00:36:35.289 + sudo kill 4121066 00:36:35.296 [Pipeline] } 00:36:35.317 [Pipeline] // stage 00:36:35.322 [Pipeline] } 00:36:35.340 [Pipeline] // timeout 00:36:35.343 [Pipeline] } 00:36:35.357 [Pipeline] // catchError 00:36:35.362 [Pipeline] } 00:36:35.377 [Pipeline] // wrap 00:36:35.382 [Pipeline] } 00:36:35.395 [Pipeline] // catchError 00:36:35.403 [Pipeline] stage 00:36:35.405 [Pipeline] { (Epilogue) 00:36:35.420 [Pipeline] catchError 00:36:35.422 [Pipeline] { 00:36:35.439 [Pipeline] echo 00:36:35.441 Cleanup processes 00:36:35.449 [Pipeline] sh 00:36:35.730 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:35.730 311479 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:36:35.730 311694 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:35.746 [Pipeline] sh 00:36:36.024 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:36.024 ++ grep -v 'sudo pgrep' 00:36:36.024 ++ awk '{print $1}' 00:36:36.024 + sudo kill -9 311479 00:36:36.036 [Pipeline] sh 00:36:36.373 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:36:48.589 [Pipeline] sh 00:36:48.870 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:36:48.870 Artifacts sizes are good 00:36:48.889 [Pipeline] archiveArtifacts 00:36:48.898 
Archiving artifacts 00:36:49.046 [Pipeline] sh 00:36:49.336 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:36:49.351 [Pipeline] cleanWs 00:36:49.361 [WS-CLEANUP] Deleting project workspace... 00:36:49.361 [WS-CLEANUP] Deferred wipeout is used... 00:36:49.368 [WS-CLEANUP] done 00:36:49.369 [Pipeline] } 00:36:49.391 [Pipeline] // catchError 00:36:49.404 [Pipeline] sh 00:36:49.682 + logger -p user.info -t JENKINS-CI 00:36:49.691 [Pipeline] } 00:36:49.709 [Pipeline] // stage 00:36:49.715 [Pipeline] } 00:36:49.733 [Pipeline] // node 00:36:49.741 [Pipeline] End of Pipeline 00:36:49.774 Finished: SUCCESS
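For reference, the coverage post-processing traced shortly before the epilogue (lcov capture, merge with the pre-test baseline, then successive path filters) can be reproduced standalone roughly as below. The directory layout is assumed to match the workspace paths in the trace, and the pre-test baseline capture (cov_base.info) is assumed to exist already; only a subset of the rc options from the trace is carried over.

    #!/usr/bin/env bash
    # Sketch of the lcov aggregation step: capture post-test counters, merge with
    # the baseline, strip DPDK/system/example paths, then drop the intermediates.
    set -euo pipefail
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}
    OUT=${OUT:-$SPDK_DIR/../output}
    LCOV_OPTS=(--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --no-external -q)

    # Capture counters accumulated while the tests ran.
    lcov "${LCOV_OPTS[@]}" -c -d "$SPDK_DIR" -t "$(hostname)" -o "$OUT/cov_test.info"

    # Merge with the baseline taken before the tests (assumed to exist).
    lcov "${LCOV_OPTS[@]}" -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" \
         -o "$OUT/cov_total.info"

    # Drop DPDK, system headers and example/app code from the report.
    for pat in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
        lcov "${LCOV_OPTS[@]}" -r "$OUT/cov_total.info" "$pat" -o "$OUT/cov_total.info"
    done

    rm -f "$OUT/cov_base.info" "$OUT/cov_test.info"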